| Column | Type | Min | Max |
|:---|:---|---:|---:|
| sha | stringlengths | 40 | 40 |
| text | stringlengths | 1 | 13.4M |
| id | stringlengths | 2 | 117 |
| tags | listlengths | 1 | 7.91k |
| created_at | stringlengths | 25 | 25 |
| metadata | stringlengths | 2 | 875k |
| last_modified | stringlengths | 25 | 25 |
| arxiv | listlengths | 0 | 25 |
| languages | listlengths | 0 | 7.91k |
| tags_str | stringlengths | 17 | 159k |
| text_str | stringlengths | 1 | 447k |
| text_lists | listlengths | 0 | 352 |
| processed_texts | listlengths | 1 | 353 |
| tokens_length | listlengths | 1 | 353 |
| input_texts | listlengths | 1 | 40 |
4a5009abc37cbb2d3fd1a745f80e5ea1405ba9aa
# Dataset `GSM8K_zh` is a dataset for mathematical reasoning in Chinese. Question-answer pairs are translated from GSM8K (https://github.com/openai/grade-school-math/tree/master) by `GPT-3.5-Turbo` with few-shot prompting. The dataset consists of 7473 training samples and 1319 testing samples. The former is for **supervised fine-tuning**, while the latter is for **evaluation**. For training samples, `question_zh` and `answer_zh` are the question and answer keys, respectively; for testing samples, only the translated questions are provided (`question_zh`). # Citation If you find the `GSM8K_zh` dataset useful for your projects/papers, please cite the following paper. ```bibtex @article{yu2023metamath, title={MetaMath: Bootstrap Your Own Mathematical Questions for Large Language Models}, author={Yu, Longhui and Jiang, Weisen and Shi, Han and Yu, Jincheng and Liu, Zhengying and Zhang, Yu and Kwok, James T and Li, Zhenguo and Weller, Adrian and Liu, Weiyang}, journal={arXiv preprint arXiv:2309.12284}, year={2023} } ```
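The card itself ships no loading code; the following is a minimal sketch, assuming the repository's files resolve through the `datasets` library into the train/test splits described above (the split names and auto-resolution are an assumption, not taken from the card).

```python
# Hedged sketch: assumes `load_dataset` can resolve this repository into
# "train" and "test" splits; adjust data_files/split names if the layout differs.
from datasets import load_dataset

ds = load_dataset("meta-math/GSM8K_zh")

train_example = ds["train"][0]
print(train_example["question_zh"])  # translated question (training split)
print(train_example["answer_zh"])    # translated answer (training split)

test_example = ds["test"][0]
print(test_example["question_zh"])   # test split only provides translated questions
```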
meta-math/GSM8K_zh
[ "task_categories:question-answering", "size_categories:n<1K", "language:en", "language:zh", "license:mit", "math", "math-qa", "chinese-math-qa", "region:us" ]
2023-12-04T03:08:44+00:00
{"language": ["en", "zh"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "tags": ["math", "math-qa", "chinese-math-qa"]}
2023-12-04T04:02:01+00:00
[]
[ "en", "zh" ]
TAGS #task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #chinese-math-qa #region-us
# Dataset 'GSM8K_zh' is a dataset for mathematical reasoning in Chinese, question-answer pairs are translated from GSM8K (URL by 'GPT-3.5-Turbo' with few-shot prompting. The dataset consists of 7473 training samples and 1319 testing samples. The former is for supervised fine-tuning, while the latter is for evaluation. for training samples, 'question_zh' and 'answer_zh' are question and answer keys, respectively; for testing samples, only the translated questions are provided ('question_zh'). If you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper.
[ "# Dataset\n\n'GSM8K_zh' is a dataset for mathematical reasoning in Chinese, question-answer pairs are translated from GSM8K (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 7473 training samples and 1319 testing samples. The former is for supervised fine-tuning, while the latter is for evaluation.\n\nfor training samples, 'question_zh' and 'answer_zh' are question and answer keys, respectively;\nfor testing samples, only the translated questions are provided ('question_zh').\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
[ "TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #chinese-math-qa #region-us \n", "# Dataset\n\n'GSM8K_zh' is a dataset for mathematical reasoning in Chinese, question-answer pairs are translated from GSM8K (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 7473 training samples and 1319 testing samples. The former is for supervised fine-tuning, while the latter is for evaluation.\n\nfor training samples, 'question_zh' and 'answer_zh' are question and answer keys, respectively;\nfor testing samples, only the translated questions are provided ('question_zh').\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
[ 55, 172 ]
[ "passage: TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #chinese-math-qa #region-us \n# Dataset\n\n'GSM8K_zh' is a dataset for mathematical reasoning in Chinese, question-answer pairs are translated from GSM8K (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 7473 training samples and 1319 testing samples. The former is for supervised fine-tuning, while the latter is for evaluation.\n\nfor training samples, 'question_zh' and 'answer_zh' are question and answer keys, respectively;\nfor testing samples, only the translated questions are provided ('question_zh').\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
38ee9a74b40f897c2347b5068fda5d7208cac0ce
Original dataset: [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k)
nlp-with-deeplearning/ko.databricks-dolly-15k
[ "task_categories:question-answering", "task_categories:summarization", "size_categories:10K<n<100K", "language:en", "language:ko", "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-04T03:12:08+00:00
{"language": ["en", "ko"], "license": "cc-by-nc-sa-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "summarization"]}
2023-12-04T03:27:59+00:00
[]
[ "en", "ko" ]
TAGS #task_categories-question-answering #task_categories-summarization #size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us
Original dataset: databricks/databricks-dolly-15k
[]
[ "TAGS\n#task_categories-question-answering #task_categories-summarization #size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us \n" ]
[ 62 ]
[ "passage: TAGS\n#task_categories-question-answering #task_categories-summarization #size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us \n" ]
9347e6bba50eb20ad36b572f17eeba6d377dd5a1
# Dataset `MetaMathQA_GSM8K_zh` is a dataset for mathematical reasoning in Chinese. Question-answer pairs are translated from MetaMathQA (https://huggingface.co/datasets/meta-math/MetaMathQA) by `GPT-3.5-Turbo` with few-shot prompting. The dataset consists of 231685 samples. # Citation If you find the `MetaMathQA_GSM8K_zh` dataset useful for your projects/papers, please cite the following paper. ```bibtex @article{yu2023metamath, title={MetaMath: Bootstrap Your Own Mathematical Questions for Large Language Models}, author={Yu, Longhui and Jiang, Weisen and Shi, Han and Yu, Jincheng and Liu, Zhengying and Zhang, Yu and Kwok, James T and Li, Zhenguo and Weller, Adrian and Liu, Weiyang}, journal={arXiv preprint arXiv:2309.12284}, year={2023} } ```
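Given the 231685 samples reported above, a streaming pass avoids downloading everything up front; this is a sketch under the assumption that the repository loads via the `datasets` library with a "train" split, and it only inspects whatever fields the records expose (the card does not document field names).

```python
# Hedged sketch: streams a few records instead of materialising all ~231k samples;
# assumes the repo is loadable with `datasets` and exposes a "train" split.
from itertools import islice
from datasets import load_dataset

stream = load_dataset("meta-math/MetaMathQA_GSM8K_zh", split="train", streaming=True)

for record in islice(stream, 3):
    print(sorted(record.keys()))  # inspect the available fields
    print(record)
```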
meta-math/MetaMathQA_GSM8K_zh
[ "task_categories:question-answering", "size_categories:n<1K", "language:en", "language:zh", "license:mit", "math", "math-qa", "region:us" ]
2023-12-04T03:13:47+00:00
{"language": ["en", "zh"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "tags": ["math", "math-qa"]}
2023-12-04T05:23:20+00:00
[]
[ "en", "zh" ]
TAGS #task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #region-us
# Dataset 'MetaMathQA_GSM8K_zh' is a dataset for mathematical reasoning in Chinese, question-answer pairs are translated from MetaMathQA (URL by 'GPT-3.5-Turbo' with few-shot prompting. The dataset consists of 231685 samples. If you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper.
[ "# Dataset\n\n'MetaMathQA_GSM8K_zh' is a dataset for mathematical reasoning in Chinese, \nquestion-answer pairs are translated from MetaMathQA (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 231685 samples.\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
[ "TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #region-us \n", "# Dataset\n\n'MetaMathQA_GSM8K_zh' is a dataset for mathematical reasoning in Chinese, \nquestion-answer pairs are translated from MetaMathQA (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 231685 samples.\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
[ 48, 101 ]
[ "passage: TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #language-Chinese #license-mit #math #math-qa #region-us \n# Dataset\n\n'MetaMathQA_GSM8K_zh' is a dataset for mathematical reasoning in Chinese, \nquestion-answer pairs are translated from MetaMathQA (URL by 'GPT-3.5-Turbo' with few-shot prompting.\nThe dataset consists of 231685 samples.\n\n\n\nIf you find the 'GSM8K_zh' dataset useful for your projects/papers, please cite the following paper." ]
dec24968c6e3eea2d2f0691d57797299374ba612
Original dataset: [teknium/openhermes](https://huggingface.co/datasets/teknium/openhermes)
nlp-with-deeplearning/ko.openhermes
[ "task_categories:text-generation", "size_categories:100K<n<1M", "language:en", "language:ko", "license:cc-by-nc-sa-4.0", "distillation", "synthetic data", "gpt", "region:us" ]
2023-12-04T03:14:12+00:00
{"language": ["en", "ko"], "license": "cc-by-nc-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "tags": ["distillation", "synthetic data", "gpt"]}
2023-12-04T03:26:54+00:00
[]
[ "en", "ko" ]
TAGS #task_categories-text-generation #size_categories-100K<n<1M #language-English #language-Korean #license-cc-by-nc-sa-4.0 #distillation #synthetic data #gpt #region-us
Original dataset: teknium/openhermes
[]
[ "TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #language-Korean #license-cc-by-nc-sa-4.0 #distillation #synthetic data #gpt #region-us \n" ]
[ 63 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #language-Korean #license-cc-by-nc-sa-4.0 #distillation #synthetic data #gpt #region-us \n" ]
56303eec1fe0f52eb30a3191a0ec569953f9b34b
Original dataset: [nvidia/HelpSteer](https://huggingface.co/datasets/nvidia/HelpSteer)
nlp-with-deeplearning/Ko.HelpSteer
[ "size_categories:10K<n<100K", "language:en", "language:ko", "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-04T03:18:07+00:00
{"language": ["en", "ko"], "license": "cc-by-nc-sa-4.0", "size_categories": ["10K<n<100K"]}
2023-12-04T03:25:23+00:00
[]
[ "en", "ko" ]
TAGS #size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us
Original dataset: nvidia/HelpSteer
[]
[ "TAGS\n#size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us \n" ]
[ 40 ]
[ "passage: TAGS\n#size_categories-10K<n<100K #language-English #language-Korean #license-cc-by-nc-sa-4.0 #region-us \n" ]
f2a0896a737fb03eb81bc88cc96cc187f73e9c1b
This dataset represents the supervised fine-tuning instructions for HuatuoGPT2, with all responses generated by GPT-4. We have made 50,000 entries of this dataset publicly available. - **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II
FreedomIntelligence/HuatuoGPT2_sft_instruct_GPT4_50K
[ "license:apache-2.0", "region:us" ]
2023-12-04T03:41:42+00:00
{"license": "apache-2.0"}
2023-12-04T11:39:54+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
This dataset represents the supervised fine-tuning instructions for HuatuoGPT2, with all responses generated by GPT-4. We have made 50,000 entries of this dataset publicly available. - Repository: URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
121352c0e164f001cbd2fc29aa6d4ceb23779104
# Dataset Card for "thestack_omp" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
celinelee/thestack_omp
[ "region:us" ]
2023-12-04T03:44:36+00:00
{"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "c", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2214853376.8, "num_examples": 76232}, {"name": "valid", "num_bytes": 276856672.1, "num_examples": 9529}, {"name": "test", "num_bytes": 276856672.1, "num_examples": 9529}], "download_size": 828673457, "dataset_size": 2768566721.0}}
2023-12-04T19:52:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "thestack_omp" More Information needed
[ "# Dataset Card for \"thestack_omp\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"thestack_omp\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"thestack_omp\"\n\nMore Information needed" ]
2efd9081cd858f09e56a44ed3105b3af9cce35d0
# Sea-bench - a benchmark for evaluating chat assistants in Southeast Asian languages. **See interactive benchmark view at [Spaces/SeaLLMs/Sea-bench](https://huggingface.co/spaces/SeaLLMs/Sea-bench)** While there are popular benchmarks to evaluate LLMs as helpful assistants, such as [MT-bench](https://huggingface.co/spaces/lmsys/mt-bench), they are only English-based and likely unsuitable for evaluating performance in low-resource languages. Due to this lack of multilingual benchmarks for assistant-style models, we engaged native linguists to build a multilingual test set with instructions that cover 9 Southeast Asian languages, called [Sea-bench](https://huggingface.co/datasets/SeaLLMs/Sea-bench). The linguists sourced such data by manually translating open-source English test sets, collecting real user questions from local forums and websites, collecting real math and reasoning questions from reputable sources, as well as writing test instructions and questions themselves. Our Sea-bench consists of diverse categories of instructions to evaluate models, as described in the following: * Task-solving: This type of data comprises various text understanding and processing tasks that test the ability of the language model to perform certain NLP tasks such as summarization, translation, etc. * Math-reasoning: This includes math problems and logical reasoning tasks. * General-instruction data: This type of data consists of general user-centric instructions, which evaluate the model's ability in general knowledge and writing. Examples for this type can be requests for recommendations, such as "Suggest three popular books," or instructions that require the model to generate creative outputs, like "Write a short story about a dragon." * NaturalQA: This consists of queries posted by real users, often in popular local forums, involving local contexts or scenarios. The aim is to test the model's capacity to understand and respond coherently to colloquial language, idioms, and locally contextual references. * Safety: This includes both general safety and local context-related safety instructions. The instructions could involve testing the model's understanding of safe practices, its ability to advise on safety rules, or its capacity to respond to safety-related queries. While most general safety questions are translated from open sources, other local context-related safety instructions are written by linguists of each language. Safety data only covers Vietnamese, Indonesian, and Thai. The released Sea-bench test set is a **small subset** that contains **20 questions per task type per language**. Sea-bench can be evaluated similarly to MT-bench, using GPT-4 as a judge with score-based grading and peer comparison (pairwise comparison).
Instructions to evaluate models on Sea-bench (score-based grading): ```bash # Clone and install LLM judge: https://github.com/lm-sys/FastChat/tree/main/fastchat/llm_judge # Download SeaLLMs/Sea-bench files to folder `Sea-bench` # Copy `Sea-bench` folder to `FastChat/fastchat/llm_judge` # Run generation, similar to MT-bench, e.g.: python gen_model_answer.py --model-path lmsys/vicuna-7b-v1.5 --model-id vicuna-7b-v1.5 --bench-name Sea-bench # Run LLM judgement python gen_judgment.py \ --parallel 6 \ --bench-name Sea-bench \ --model-list ${YOUR_MODEL_NAME} ``` ## Evaluation results: ![fig_sea_bench_side_by_side.png](fig_sea_bench_side_by_side.png) Sea-bench is used to evaluate [SeaLLMs](https://huggingface.co/SeaLLMs/SeaLLM-Chat-13b), a group of language models built with a focus on Southeast Asian languages. ## Contribution If you have Sea-bench evaluations of your model that you would like included in the aggregated results, please submit a pull request with an updated [model_judgment/gpt-4_single.jsonl](https://huggingface.co/datasets/SeaLLMs/Sea-bench/tree/main/model_judgment/gpt-4_single.jsonl) file. Please use a different model name than the ones indicated in the file. ## Citation If you find our project useful, we hope you can star our repo and cite our work as follows. Corresponding Author: [[email protected]](mailto:[email protected]) ``` @article{damonlpsg2023seallm, author = {Xuan-Phi Nguyen*, Wenxuan Zhang*, Xin Li*, Mahani Aljunied*, Qingyu Tan, Liying Cheng, Guanzheng Chen, Yue Deng, Sen Yang, Chaoqun Liu, Hang Zhang, Lidong Bing}, title = {SeaLLMs - Large Language Models for Southeast Asia}, year = 2023, Eprint = {arXiv:2312.00738}, } ```
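Before running the FastChat judging pipeline above, the released questions can be inspected directly; this is a sketch assuming the `Question` config (backed by `question.jsonl` in the repo metadata) loads with the `datasets` library, without assuming anything about its field names.

```python
# Hedged sketch: loads the "Question" config declared in the repository metadata
# and prints one record to see which fields (language, category, turns, ...) exist.
from datasets import load_dataset

questions = load_dataset("SeaLLMs/Sea-bench", "Question", split="train")
print(len(questions))  # the card states 20 questions per task type per language
print(questions[0])    # inspect the available fields
```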
SeaLLMs/Sea-bench
[ "size_categories:n<1K", "language:en", "language:vi", "language:id", "language:km", "language:th", "language:lo", "language:my", "language:ms", "language:tl", "language:zh", "license:apache-2.0", "region:us" ]
2023-12-04T03:53:06+00:00
{"language": ["en", "vi", "id", "km", "th", "lo", "my", "ms", "tl", "zh"], "license": "apache-2.0", "size_categories": ["n<1K"], "configs": [{"config_name": "Question", "data_files": "question.jsonl"}]}
2023-12-12T13:34:52+00:00
[]
[ "en", "vi", "id", "km", "th", "lo", "my", "ms", "tl", "zh" ]
TAGS #size_categories-n<1K #language-English #language-Vietnamese #language-Indonesian #language-Khmer #language-Thai #language-Lao #language-Burmese #language-Malay (macrolanguage) #language-Tagalog #language-Chinese #license-apache-2.0 #region-us
# Sea-bench - a benchmark for evaluating chat assistants in Southeast Asian languages. See interactive benchmark view at Spaces/SeaLLMs/Sea-bench While there are popular benchmarks to evaluate LLMs as helpful assistants, such as MT-bench, they are only English-based and likely to be unsuitable to evaluate performances in low-resource languages. Due to this lack of multilingual benchmarks for assistant-style models, we engaged native linguists to build a multilingual test set with instructions that cover 9 Southeast Asian languages, called Sea-bench. The linguists sourced such data by manually translating open-source English test sets, collecting real user questions from local forums and websites, collecting real math and reasoning questions from reputable sources, as well as writing test instructions and questions themselves. Our Sea-Bench consists of diverse categories of instructions to evaluate models, as described in the following: * Task-solving: This type of data comprises various text understanding and processing tasks that test the ability of the language model to perform certain NLP tasks such as summarization, translation, etc. * Math-reasoning: This includes math problems and logical reasoning tasks. * General-instruction data: This type of data consists of general user-centric instructions, which evaluate the model's ability in general knowledge and writing. Examples for this type can be requests for recommendations, such as "Suggest three popular books," or instructions that require the model to generate creative outputs, like "Write a short story about a dragon." * NaturalQA: This consists of queries posted by real users, often in popular local forums, involving local contexts or scenarios. The aim is to test the model's capacity to understand and respond coherently to colloquial language, idioms, and locally contextual references. * Safety: This includes both general safety and local context-related safety instructions. The instructions could involve testing the model's understanding of safe practices, its ability to advise on safety rules, or its capacity to respond to safety-related queries. While most general safety questions are translated from open sources, other local context-related safety instructions are written by linguists of each language. Safety data only covers Vietnamese, Indonesian, and Thai. The released Sea-bench test set is a small subset that contains 20 questions per task type per language. Sea-bench can be evaluated similar to MT-bench, using GPT-4 as a judge in a score-based grading metrics and a peer comparison (or pairwise comparison) manner. Instructions to evaluate models on Sea-bench (score-based grading): ## Evaluation results: !fig_sea_bench_side_by_side.png Sea-bench is used to evaluate SeaLLMs, a group of language models built with focus in Southeast Asian languages. ## Contribution If you have Sea-bench evaluations on your model that you would like into the aggregated results. Please kindly submit a pull request with an updated model_judgment/gpt-4_single.jsonl file. Please use a different model name than the ones indicated in the file. If you find our project useful, hope you can star our repo and cite our work as follows. Corresponding Author: l.bing@URL
[ "# Sea-bench - a benchmark for evaluating chat assistants in Southeast Asian languages.\n\nSee interactive benchmark view at Spaces/SeaLLMs/Sea-bench\n\nWhile there are popular benchmarks to evaluate LLMs as helpful assistants, such as MT-bench, \nthey are only English-based and likely to be unsuitable to evaluate performances in low-resource languages. \nDue to this lack of multilingual benchmarks for assistant-style models, we engaged native linguists to build a multilingual test set with instructions that \ncover 9 Southeast Asian languages, called Sea-bench. \nThe linguists sourced such data by manually translating open-source English test sets, collecting real user questions from local forums and websites, \ncollecting real math and reasoning questions from reputable sources, as well as writing test instructions and questions themselves.\n\nOur Sea-Bench consists of diverse categories of instructions to evaluate models, as described in the following:\n* Task-solving: This type of data comprises various text understanding and processing tasks that test the ability of the language model to perform certain NLP tasks such as summarization, translation, etc.\n* Math-reasoning: This includes math problems and logical reasoning tasks.\n* General-instruction data: This type of data consists of general user-centric instructions, which evaluate the model's ability in general knowledge and writing. Examples for this type can be requests for recommendations, such as \"Suggest three popular books,\" or instructions that require the model to generate creative outputs, like \"Write a short story about a dragon.\"\n* NaturalQA: This consists of queries posted by real users, often in popular local forums, involving local contexts or scenarios. The aim is to test the model's capacity to understand and respond coherently to colloquial language, idioms, and locally contextual references.\n* Safety: This includes both general safety and local context-related safety instructions. The instructions could involve testing the model's understanding of safe practices, its ability to advise on safety rules, or its capacity to respond to safety-related queries. While most general safety questions are translated from open sources, other local context-related safety instructions are written by linguists of each language. Safety data only covers Vietnamese, Indonesian, and Thai.\n\nThe released Sea-bench test set is a small subset that contains 20 questions per task type per language. \n\nSea-bench can be evaluated similar to MT-bench, using GPT-4 as a judge in a score-based grading metrics and a peer comparison (or pairwise comparison) manner.\n\nInstructions to evaluate models on Sea-bench (score-based grading):", "## Evaluation results:\n\n!fig_sea_bench_side_by_side.png\n\nSea-bench is used to evaluate SeaLLMs, a group of language models built with focus in Southeast Asian languages.", "## Contribution\n\nIf you have Sea-bench evaluations on your model that you would like into the aggregated results. Please kindly submit a pull request with an updated model_judgment/gpt-4_single.jsonl\n file. Please use a different model name than the ones indicated in the file.\n\n\nIf you find our project useful, hope you can star our repo and cite our work as follows. Corresponding Author: l.bing@URL" ]
[ "TAGS\n#size_categories-n<1K #language-English #language-Vietnamese #language-Indonesian #language-Khmer #language-Thai #language-Lao #language-Burmese #language-Malay (macrolanguage) #language-Tagalog #language-Chinese #license-apache-2.0 #region-us \n", "# Sea-bench - a benchmark for evaluating chat assistants in Southeast Asian languages.\n\nSee interactive benchmark view at Spaces/SeaLLMs/Sea-bench\n\nWhile there are popular benchmarks to evaluate LLMs as helpful assistants, such as MT-bench, \nthey are only English-based and likely to be unsuitable to evaluate performances in low-resource languages. \nDue to this lack of multilingual benchmarks for assistant-style models, we engaged native linguists to build a multilingual test set with instructions that \ncover 9 Southeast Asian languages, called Sea-bench. \nThe linguists sourced such data by manually translating open-source English test sets, collecting real user questions from local forums and websites, \ncollecting real math and reasoning questions from reputable sources, as well as writing test instructions and questions themselves.\n\nOur Sea-Bench consists of diverse categories of instructions to evaluate models, as described in the following:\n* Task-solving: This type of data comprises various text understanding and processing tasks that test the ability of the language model to perform certain NLP tasks such as summarization, translation, etc.\n* Math-reasoning: This includes math problems and logical reasoning tasks.\n* General-instruction data: This type of data consists of general user-centric instructions, which evaluate the model's ability in general knowledge and writing. Examples for this type can be requests for recommendations, such as \"Suggest three popular books,\" or instructions that require the model to generate creative outputs, like \"Write a short story about a dragon.\"\n* NaturalQA: This consists of queries posted by real users, often in popular local forums, involving local contexts or scenarios. The aim is to test the model's capacity to understand and respond coherently to colloquial language, idioms, and locally contextual references.\n* Safety: This includes both general safety and local context-related safety instructions. The instructions could involve testing the model's understanding of safe practices, its ability to advise on safety rules, or its capacity to respond to safety-related queries. While most general safety questions are translated from open sources, other local context-related safety instructions are written by linguists of each language. Safety data only covers Vietnamese, Indonesian, and Thai.\n\nThe released Sea-bench test set is a small subset that contains 20 questions per task type per language. \n\nSea-bench can be evaluated similar to MT-bench, using GPT-4 as a judge in a score-based grading metrics and a peer comparison (or pairwise comparison) manner.\n\nInstructions to evaluate models on Sea-bench (score-based grading):", "## Evaluation results:\n\n!fig_sea_bench_side_by_side.png\n\nSea-bench is used to evaluate SeaLLMs, a group of language models built with focus in Southeast Asian languages.", "## Contribution\n\nIf you have Sea-bench evaluations on your model that you would like into the aggregated results. Please kindly submit a pull request with an updated model_judgment/gpt-4_single.jsonl\n file. Please use a different model name than the ones indicated in the file.\n\n\nIf you find our project useful, hope you can star our repo and cite our work as follows. 
Corresponding Author: l.bing@URL" ]
[ 79, 614, 51, 101 ]
[ "passage: TAGS\n#size_categories-n<1K #language-English #language-Vietnamese #language-Indonesian #language-Khmer #language-Thai #language-Lao #language-Burmese #language-Malay (macrolanguage) #language-Tagalog #language-Chinese #license-apache-2.0 #region-us \n" ]
c6da1ea3a6350f7a18b84e3e197666a2f67e3526
# Code Instruct Multiturn Originally from https://huggingface.co/datasets/TokenBender/glaive_coder_raw_text, use Ultrachat template to generate multiturn using ChatGPT3.5 Notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/chatbot/code-instruct-multiturn There are bad pairs you need to reject, follow https://github.com/mesolitica/malaysian-dataset/blob/master/llm-instruction/prepare-malaysian-ultrachat.ipynb - [ultrachat-glaive_coder_raw_text.jsonl](ultrachat-glaive_coder_raw_text.jsonl), 127253 rows, 1.43 GB ## Example data ```python [{'role': 'user', 'content': 'Bagaimana cara untuk memasang Python 3 pada instans AWS EC2? Saya telah mencuba menggunakan arahan `sudo yum install python3`, tetapi saya menerima mesej ralat yang mengatakan `Tiada pakej python3 tersedia.`. Saya telah mencari dalam talian tetapi tidak menemui sebarang penyelesaian. Adakah saya perlu memuat turun dan memasangnya secara manual?', 'content_ms': None, 'indon': False}, {'role': 'assistant', 'content': 'Untuk memasang Python 3 pada instans AWS EC2, anda boleh menggunakan arahan `yum` untuk memasang pakej yang diperlukan. Namun, cara terbaik untuk memasang Python 3 adalah dengan menggunakan EPEL (Extra Packages for Enterprise Linux) repository. Anda boleh menggunakan arahan berikut untuk memasang EPEL repository:\n\n```bash\nsudo yum install epel-release\n```\n\nSelepas itu, anda boleh memasang Python 3 dengan menggunakan arahan berikut:\n\n```bash\nsudo yum install python3\n```\n\nDengan langkah-langkah ini, anda seharusnya dapat memasang Python 3 dengan mudah pada instans AWS EC2 anda. Semoga berjaya!', 'content_ms': 'Untuk memasang Python 3 pada instans AWS EC2, anda boleh menggunakan arahan `yum` untuk memasang pakej yang diperlukan. Walau bagaimanapun, cara terbaik untuk memasang Python 3 adalah dengan menggunakan repositori EPEL (Extra Packages for Enterprise Linux). Anda boleh menggunakan arahan berikut untuk memasang repositori EPEL:\n\n```bash\nsudo yum install epel-release\n```\n\nSelepas itu, anda boleh memasang Python 3 dengan menggunakan arahan berikut:\n\n```bash\nsudo yum install python3\n```\n\nDengan langkah-langkah ini, anda sepatutnya dapat memasang Python 3 dengan mudah pada instans AWS EC2 anda. Semoga berjaya!', 'indon': False}, {'role': 'user', 'content': 'Terima kasih atas panduannya. Saya ingin tahu apakah pakej-pakej lain yang perlu saya pasang bagi memastikan Python 3 berfungsi dengan baik? Adakah ada langkah-langkah tambahan yang perlu saya ikuti setelah memasang Python 3? Saya ingin memastikan bahawa semua kebergantungan dan konfigurasi yang diperlukan dipenuhi.', 'content_ms': 'Terima kasih atas panduan. Saya ingin tahu apakah pakej lain yang perlu saya pasang untuk memastikan Python 3 berfungsi dengan baik? Adakah terdapat langkah tambahan yang perlu saya ikuti selepas memasang Python 3? Saya ingin memastikan semua kebergantungan dan konfigurasi yang diperlukan dipenuhi.', 'indon': False}, {'role': 'assistant', 'content': 'Setelah anda memasang Python 3, terdapat beberapa pakej tambahan yang disyorkan untuk memastikan ia berfungsi dengan baik. Berikut adalah beberapa pakej yang perlu anda pasang:\n\n1. Setuptools: Pakej ini menyediakan alat bantu untuk mengelola dan memasang pustaka Python. Anda boleh memasangnya dengan menggunakan pip dengan arahan berikut: \n```bash\nsudo pip3 install setuptools\n```\n\n2. Pip: Pakej ini merupakan pengurus pustaka Python yang popular. Ia membolehkan anda memasang dan menguruskan pustaka pihak ketiga dengan mudah. 
Anda boleh memasangnya dengan menggunakan arahan berikut:\n```bash\nsudo yum install python3-pip\n```\n\n3. Virtualenv: Pakej ini membolehkan anda membuat persekitaran Python yang berasingan untuk setiap projek. Ini adalah cara yang baik untuk mengurangkan konflik kebergantungan antara projek yang berbeza. Anda boleh memasangnya dengan menggunakan pip:\n```bash\nsudo pip3 install virtualenv\n```\n\nSetelah memasang pakej-pakej ini, anda sepatutnya mempunyai pengaturan asas untuk menggunakan Python 3 secara efektif. Jika anda memperlukan konfigurasi tambahan untuk kegunaan tertentu, ia akan bergantung kepada keperluan projek anda. Tetapi dalam kebanyakan kes, langkah-langkah ini sudah mencukupi untuk memastikan Python 3 berfungsi dengan baik. Jika anda menghadapi sebarang isu atau memerlukan bantuan tambahan, jangan segan untuk bertanya!', 'content_ms': 'Selepas anda memasang Python 3, terdapat beberapa pakej tambahan yang disyorkan untuk memastikan ia berfungsi dengan baik. Berikut adalah beberapa pakej yang anda perlu pasang:\n\n1. Setuptools: Pakej ini menyediakan alat bantu untuk menguruskan dan memasang perpustakaan Python. Anda boleh memasangnya dengan menggunakan pip dengan arahan berikut:\n```bash\nsudo pip3 install setuptools\n```\n\n2. Pip: Pakej ini adalah pengurus perpustakaan Python yang popular. Ia membolehkan anda memasang dan menguruskan perpustakaan pihak ketiga dengan mudah. Anda boleh memasangnya dengan menggunakan arahan berikut:\n```bash\nsudo yum install python3-pip\n```\n\n3. Virtualenv: Pakej ini membolehkan anda membuat persekitaran Python yang berasingan untuk setiap projek. Ini adalah cara yang baik untuk mengurangkan konflik kebergantungan antara projek yang berbeza. Anda boleh memasangnya dengan menggunakan pip:\n```bash\nsudo pip3 install virtualenv\n```\n\nSetelah memasang pakej-pakej ini, anda sepatutnya mempunyai pengaturan asas untuk menggunakan Python 3 dengan berkesan. Jika anda memerlukan konfigurasi tambahan untuk kegunaan tertentu, ia akan bergantung pada keperluan projek anda. Tetapi dalam kebanyakan kes, langkah-langkah ini sudah mencukupi untuk memastikan Python 3 berfungsi dengan baik. Jika anda menghadapi sebarang masalah atau memerlukan bantuan tambahan, jangan ragu untuk bertanya!', 'indon': False}] ```
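A small reading sketch, not from the card: assuming each line of `ultrachat-glaive_coder_raw_text.jsonl` holds one conversation shaped like the example above (a list of role/content turn dicts), it can be walked as below; the actual rules for rejecting bad pairs live in the linked notebook and are not reproduced here.

```python
# Hedged sketch: assumes one conversation per JSONL line, each being a list of
# {'role', 'content', 'content_ms', 'indon'} turn dicts as in the example above.
import json

with open("ultrachat-glaive_coder_raw_text.jsonl", encoding="utf-8") as fh:
    for line in fh:
        conversation = json.loads(line)
        for turn in conversation:
            print(turn["role"], (turn.get("content") or "")[:60])
        break  # only peek at the first conversation
```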
mesolitica/chatgpt-code-instruct-multiturn
[ "task_categories:conversational", "language:ms", "region:us" ]
2023-12-04T04:01:33+00:00
{"language": ["ms"], "task_categories": ["conversational"]}
2024-02-02T08:23:33+00:00
[]
[ "ms" ]
TAGS #task_categories-conversational #language-Malay (macrolanguage) #region-us
# Code Instruct Multiturn Originally from URL use Ultrachat template to generate multiturn using ChatGPT3.5 Notebooks at URL There are bad pairs you need to reject, follow URL - ultrachat-glaive_coder_raw_text.jsonl, 127253 rows, 1.43 GB ## Example data bash\nsudo yum install epel-release\nbash\nsudo yum install python3\nbash\nsudo yum install epel-release\nbash\nsudo yum install python3\nbash\nsudo pip3 install setuptools\nbash\nsudo yum install python3-pip\nbash\nsudo pip3 install virtualenv\nbash\nsudo pip3 install setuptools\nbash\nsudo yum install python3-pip\nbash\nsudo pip3 install virtualenv\n
[ "# Code Instruct Multiturn\n\nOriginally from URL use Ultrachat template to generate multiturn using ChatGPT3.5\n\nNotebooks at URL\n\nThere are bad pairs you need to reject, follow URL\n\n- ultrachat-glaive_coder_raw_text.jsonl, 127253 rows, 1.43 GB", "## Example data\n\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\n" ]
[ "TAGS\n#task_categories-conversational #language-Malay (macrolanguage) #region-us \n", "# Code Instruct Multiturn\n\nOriginally from URL use Ultrachat template to generate multiturn using ChatGPT3.5\n\nNotebooks at URL\n\nThere are bad pairs you need to reject, follow URL\n\n- ultrachat-glaive_coder_raw_text.jsonl, 127253 rows, 1.43 GB", "## Example data\n\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\n" ]
[ 26, 70, 129 ]
[ "passage: TAGS\n#task_categories-conversational #language-Malay (macrolanguage) #region-us \n# Code Instruct Multiturn\n\nOriginally from URL use Ultrachat template to generate multiturn using ChatGPT3.5\n\nNotebooks at URL\n\nThere are bad pairs you need to reject, follow URL\n\n- ultrachat-glaive_coder_raw_text.jsonl, 127253 rows, 1.43 GB## Example data\n\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo yum install epel-release\\nbash\\nsudo yum install python3\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\nbash\\nsudo pip3 install setuptools\\nbash\\nsudo yum install python3-pip\\nbash\\nsudo pip3 install virtualenv\\n" ]
61ec86634fe97cabd3b687b06f7a458a97aab7ef
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.6.3.dev0
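The training-procedure section above lists the bitsandbytes settings only as raw key/value pairs; the sketch below shows how the same settings could be expressed with `transformers`' `BitsAndBytesConfig` when loading the base model named in this repo's metadata (`baffo32/decapoda-research-llama-7B-hf`) and attaching this adapter with PEFT. This is an illustration of the listed config, not code taken from the card.

```python
# Hedged sketch: mirrors the quantization config listed above and attaches the
# LoRA adapter; model/repo ids come from this repository's metadata.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    load_in_4bit=False,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="fp4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float32,
)

base = AutoModelForCausalLM.from_pretrained(
    "baffo32/decapoda-research-llama-7B-hf",  # base_model from the repo metadata
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "minimindy/lora-checkpoint-50")
```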
minimindy/lora-checkpoint-50
[ "arxiv:1910.09700", "region:us" ]
2023-12-04T04:01:57+00:00
{"library_name": "peft", "base_model": "baffo32/decapoda-research-llama-7B-hf"}
2023-12-04T05:58:17+00:00
[ "1910.09700" ]
[]
TAGS #arxiv-1910.09700 #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ## Training procedure The following 'bitsandbytes' quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.6.3.dev0
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "## Training procedure\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32", "### Framework versions\n\n- PEFT 0.6.3.dev0" ]
[ "TAGS\n#arxiv-1910.09700 #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "## Training procedure\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: True\n- load_in_4bit: False\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: fp4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float32", "### Framework versions\n\n- PEFT 0.6.3.dev0" ]
[ 15, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 164, 13 ]
[ "passage: TAGS\n#arxiv-1910.09700 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
4ba8e43ab50177eed64df3a96e78d898978d5771
### Dataset The glue consists of a roughly even three-way split between: - The entirety of HF No Robots. - The entirety of TinyPixel/orca-mini - Part of the Alpaca dataset (randomly chosen)
athirdpath/Merge_Glue
[ "language:en", "license:cc-by-nc-4.0", "region:us" ]
2023-12-04T04:29:39+00:00
{"language": ["en"], "license": "cc-by-nc-4.0"}
2023-12-05T11:00:18+00:00
[]
[ "en" ]
TAGS #language-English #license-cc-by-nc-4.0 #region-us
### Dataset The glue consists of a roughly even three-way split between: - The entirety of HF No Robots. - The entirety of TinyPixel/orca-mini - Part of the Alpaca dataset (randomly chosen)
[ "### Dataset\n\nThe glue consists of a roughly even three-way split between:\n- The entirety of HF No Robots.\n- The entirety of TinyPixel/orca-mini\n- Part of the Alpaca dataset (randomly chosen)" ]
[ "TAGS\n#language-English #license-cc-by-nc-4.0 #region-us \n", "### Dataset\n\nThe glue consists of a roughly even three-way split between:\n- The entirety of HF No Robots.\n- The entirety of TinyPixel/orca-mini\n- Part of the Alpaca dataset (randomly chosen)" ]
[ 21, 60 ]
[ "passage: TAGS\n#language-English #license-cc-by-nc-4.0 #region-us \n### Dataset\n\nThe glue consists of a roughly even three-way split between:\n- The entirety of HF No Robots.\n- The entirety of TinyPixel/orca-mini\n- Part of the Alpaca dataset (randomly chosen)" ]
dabc6f941af13714bc347b14efd5dcaf81629b80
# Bangumi Image Base of Kami Nomi Zo Shiru Sekai This is the image base of bangumi Kami Nomi zo Shiru Sekai. We detected 60 characters and 5684 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 331 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 1290 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 64 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 289 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 77 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 244 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 18 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 74 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 27 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 
6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 29 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 49 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 129 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 114 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 492 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 353 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 38 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 95 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 25 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 37 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 14 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 49 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 
3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 208 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 13 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 19 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 136 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 41 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 30 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 29 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 12 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 26 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 84 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 324 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 
16 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | ![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 11 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 7 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | N/A | | 35 | 17 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 25 | [Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 18 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 50 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 21 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 5 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | N/A | N/A | N/A | | 41 | 51 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 10 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | ![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | 43 | 7 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | ![preview 6](43/preview_6.png) | ![preview 7](43/preview_7.png) | N/A | | 44 | 12 | [Download](44/dataset.zip) 
| ![preview 1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | ![preview 8](44/preview_8.png) | | 45 | 18 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) | | 46 | 10 | [Download](46/dataset.zip) | ![preview 1](46/preview_1.png) | ![preview 2](46/preview_2.png) | ![preview 3](46/preview_3.png) | ![preview 4](46/preview_4.png) | ![preview 5](46/preview_5.png) | ![preview 6](46/preview_6.png) | ![preview 7](46/preview_7.png) | ![preview 8](46/preview_8.png) | | 47 | 7 | [Download](47/dataset.zip) | ![preview 1](47/preview_1.png) | ![preview 2](47/preview_2.png) | ![preview 3](47/preview_3.png) | ![preview 4](47/preview_4.png) | ![preview 5](47/preview_5.png) | ![preview 6](47/preview_6.png) | ![preview 7](47/preview_7.png) | N/A | | 48 | 14 | [Download](48/dataset.zip) | ![preview 1](48/preview_1.png) | ![preview 2](48/preview_2.png) | ![preview 3](48/preview_3.png) | ![preview 4](48/preview_4.png) | ![preview 5](48/preview_5.png) | ![preview 6](48/preview_6.png) | ![preview 7](48/preview_7.png) | ![preview 8](48/preview_8.png) | | 49 | 20 | [Download](49/dataset.zip) | ![preview 1](49/preview_1.png) | ![preview 2](49/preview_2.png) | ![preview 3](49/preview_3.png) | ![preview 4](49/preview_4.png) | ![preview 5](49/preview_5.png) | ![preview 6](49/preview_6.png) | ![preview 7](49/preview_7.png) | ![preview 8](49/preview_8.png) | | 50 | 22 | [Download](50/dataset.zip) | ![preview 1](50/preview_1.png) | ![preview 2](50/preview_2.png) | ![preview 3](50/preview_3.png) | ![preview 4](50/preview_4.png) | ![preview 5](50/preview_5.png) | ![preview 6](50/preview_6.png) | ![preview 7](50/preview_7.png) | ![preview 8](50/preview_8.png) | | 51 | 19 | [Download](51/dataset.zip) | ![preview 1](51/preview_1.png) | ![preview 2](51/preview_2.png) | ![preview 3](51/preview_3.png) | ![preview 4](51/preview_4.png) | ![preview 5](51/preview_5.png) | ![preview 6](51/preview_6.png) | ![preview 7](51/preview_7.png) | ![preview 8](51/preview_8.png) | | 52 | 7 | [Download](52/dataset.zip) | ![preview 1](52/preview_1.png) | ![preview 2](52/preview_2.png) | ![preview 3](52/preview_3.png) | ![preview 4](52/preview_4.png) | ![preview 5](52/preview_5.png) | ![preview 6](52/preview_6.png) | ![preview 7](52/preview_7.png) | N/A | | 53 | 12 | [Download](53/dataset.zip) | ![preview 1](53/preview_1.png) | ![preview 2](53/preview_2.png) | ![preview 3](53/preview_3.png) | ![preview 4](53/preview_4.png) | ![preview 5](53/preview_5.png) | ![preview 6](53/preview_6.png) | ![preview 7](53/preview_7.png) | ![preview 8](53/preview_8.png) | | 54 | 16 | [Download](54/dataset.zip) | ![preview 1](54/preview_1.png) | ![preview 2](54/preview_2.png) | ![preview 3](54/preview_3.png) | ![preview 4](54/preview_4.png) | ![preview 5](54/preview_5.png) | ![preview 6](54/preview_6.png) | ![preview 7](54/preview_7.png) | ![preview 8](54/preview_8.png) | | 55 | 5 | [Download](55/dataset.zip) | ![preview 1](55/preview_1.png) | ![preview 2](55/preview_2.png) | ![preview 3](55/preview_3.png) | ![preview 4](55/preview_4.png) | ![preview 5](55/preview_5.png) | N/A | N/A | N/A | | 56 | 6 | [Download](56/dataset.zip) | ![preview 1](56/preview_1.png) 
| ![preview 2](56/preview_2.png) | ![preview 3](56/preview_3.png) | ![preview 4](56/preview_4.png) | ![preview 5](56/preview_5.png) | ![preview 6](56/preview_6.png) | N/A | N/A | | 57 | 23 | [Download](57/dataset.zip) | ![preview 1](57/preview_1.png) | ![preview 2](57/preview_2.png) | ![preview 3](57/preview_3.png) | ![preview 4](57/preview_4.png) | ![preview 5](57/preview_5.png) | ![preview 6](57/preview_6.png) | ![preview 7](57/preview_7.png) | ![preview 8](57/preview_8.png) | | 58 | 6 | [Download](58/dataset.zip) | ![preview 1](58/preview_1.png) | ![preview 2](58/preview_2.png) | ![preview 3](58/preview_3.png) | ![preview 4](58/preview_4.png) | ![preview 5](58/preview_5.png) | ![preview 6](58/preview_6.png) | N/A | N/A | | noise | 489 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
BangumiBase/kaminomizoshirusekai
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-12-04T05:16:11+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-12-04T08:33:31+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Kami Nomi Zo Shiru Sekai ============================================== This is the image base of bangumi Kami Nomi zo Shiru Sekai; we detected 60 characters and 5684 images in total. The full dataset is here. Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
cf77665e7d6dc28a9b57cb9155a3c5406e902ca0
# Bangumi Image Base of Kimi Ni Todoke This is the image base of bangumi Kimi ni Todoke, we detected 31 characters, 4387 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 897 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 234 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 778 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 36 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 48 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 166 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 222 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 38 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 52 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | 
![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 34 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 16 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 408 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 571 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 27 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 238 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 46 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 19 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 48 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 42 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 29 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 47 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | 
![preview 4](20/preview_4.png) | ![preview 5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 86 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 62 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 18 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 18 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 89 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 17 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 25 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 17 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 6 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | N/A | N/A | | noise | 53 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
BangumiBase/kiminitodoke
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-12-04T05:16:29+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-12-04T08:20:23+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Kimi Ni Todoke ==================================== This is the image base of bangumi Kimi ni Todoke; we detected 31 characters and 4387 images in total. The full dataset is here. Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
a6e5bbc38bd072b251ac2c9e25ab3d10462677d0
# Bangumi Image Base of Ef - A Tale This is the image base of bangumi Ef - A Tale, we detected 21 characters, 2004 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 101 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 52 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 127 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 252 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 157 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 193 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 53 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 81 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 21 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 
7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 15 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 323 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 103 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 11 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 24 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 10 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 24 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 76 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 44 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 155 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 27 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | noise | 155 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | 
![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
BangumiBase/efatale
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-12-04T05:17:05+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-12-04T06:58:35+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Ef - A Tale ================================= This is the image base of bangumi Ef - A Tale; we detected 21 characters and 2004 images in total. The full dataset is here. Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
2821c8cd2744932a4a0992607d1aea96e40c6405
# Bangumi Image Base of Kaichou Wa Maid-sama! This is the image base of bangumi Kaichou wa Maid-sama!, we detected 47 characters, 3607 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 60 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 46 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 6 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | N/A | N/A | | 3 | 423 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 55 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 144 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 68 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 9 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 204 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 
8](8/preview_8.png) | | 9 | 124 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 106 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 34 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 66 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 78 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 40 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 70 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) | | 16 | 55 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) | | 17 | 32 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) | | 18 | 70 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | ![preview 6](18/preview_6.png) | ![preview 7](18/preview_7.png) | ![preview 8](18/preview_8.png) | | 19 | 854 | [Download](19/dataset.zip) | ![preview 1](19/preview_1.png) | ![preview 2](19/preview_2.png) | ![preview 3](19/preview_3.png) | ![preview 4](19/preview_4.png) | ![preview 5](19/preview_5.png) | ![preview 6](19/preview_6.png) | ![preview 7](19/preview_7.png) | ![preview 8](19/preview_8.png) | | 20 | 64 | [Download](20/dataset.zip) | ![preview 1](20/preview_1.png) | ![preview 2](20/preview_2.png) | ![preview 3](20/preview_3.png) | ![preview 4](20/preview_4.png) | ![preview 
5](20/preview_5.png) | ![preview 6](20/preview_6.png) | ![preview 7](20/preview_7.png) | ![preview 8](20/preview_8.png) | | 21 | 50 | [Download](21/dataset.zip) | ![preview 1](21/preview_1.png) | ![preview 2](21/preview_2.png) | ![preview 3](21/preview_3.png) | ![preview 4](21/preview_4.png) | ![preview 5](21/preview_5.png) | ![preview 6](21/preview_6.png) | ![preview 7](21/preview_7.png) | ![preview 8](21/preview_8.png) | | 22 | 27 | [Download](22/dataset.zip) | ![preview 1](22/preview_1.png) | ![preview 2](22/preview_2.png) | ![preview 3](22/preview_3.png) | ![preview 4](22/preview_4.png) | ![preview 5](22/preview_5.png) | ![preview 6](22/preview_6.png) | ![preview 7](22/preview_7.png) | ![preview 8](22/preview_8.png) | | 23 | 106 | [Download](23/dataset.zip) | ![preview 1](23/preview_1.png) | ![preview 2](23/preview_2.png) | ![preview 3](23/preview_3.png) | ![preview 4](23/preview_4.png) | ![preview 5](23/preview_5.png) | ![preview 6](23/preview_6.png) | ![preview 7](23/preview_7.png) | ![preview 8](23/preview_8.png) | | 24 | 134 | [Download](24/dataset.zip) | ![preview 1](24/preview_1.png) | ![preview 2](24/preview_2.png) | ![preview 3](24/preview_3.png) | ![preview 4](24/preview_4.png) | ![preview 5](24/preview_5.png) | ![preview 6](24/preview_6.png) | ![preview 7](24/preview_7.png) | ![preview 8](24/preview_8.png) | | 25 | 24 | [Download](25/dataset.zip) | ![preview 1](25/preview_1.png) | ![preview 2](25/preview_2.png) | ![preview 3](25/preview_3.png) | ![preview 4](25/preview_4.png) | ![preview 5](25/preview_5.png) | ![preview 6](25/preview_6.png) | ![preview 7](25/preview_7.png) | ![preview 8](25/preview_8.png) | | 26 | 26 | [Download](26/dataset.zip) | ![preview 1](26/preview_1.png) | ![preview 2](26/preview_2.png) | ![preview 3](26/preview_3.png) | ![preview 4](26/preview_4.png) | ![preview 5](26/preview_5.png) | ![preview 6](26/preview_6.png) | ![preview 7](26/preview_7.png) | ![preview 8](26/preview_8.png) | | 27 | 22 | [Download](27/dataset.zip) | ![preview 1](27/preview_1.png) | ![preview 2](27/preview_2.png) | ![preview 3](27/preview_3.png) | ![preview 4](27/preview_4.png) | ![preview 5](27/preview_5.png) | ![preview 6](27/preview_6.png) | ![preview 7](27/preview_7.png) | ![preview 8](27/preview_8.png) | | 28 | 31 | [Download](28/dataset.zip) | ![preview 1](28/preview_1.png) | ![preview 2](28/preview_2.png) | ![preview 3](28/preview_3.png) | ![preview 4](28/preview_4.png) | ![preview 5](28/preview_5.png) | ![preview 6](28/preview_6.png) | ![preview 7](28/preview_7.png) | ![preview 8](28/preview_8.png) | | 29 | 24 | [Download](29/dataset.zip) | ![preview 1](29/preview_1.png) | ![preview 2](29/preview_2.png) | ![preview 3](29/preview_3.png) | ![preview 4](29/preview_4.png) | ![preview 5](29/preview_5.png) | ![preview 6](29/preview_6.png) | ![preview 7](29/preview_7.png) | ![preview 8](29/preview_8.png) | | 30 | 28 | [Download](30/dataset.zip) | ![preview 1](30/preview_1.png) | ![preview 2](30/preview_2.png) | ![preview 3](30/preview_3.png) | ![preview 4](30/preview_4.png) | ![preview 5](30/preview_5.png) | ![preview 6](30/preview_6.png) | ![preview 7](30/preview_7.png) | ![preview 8](30/preview_8.png) | | 31 | 35 | [Download](31/dataset.zip) | ![preview 1](31/preview_1.png) | ![preview 2](31/preview_2.png) | ![preview 3](31/preview_3.png) | ![preview 4](31/preview_4.png) | ![preview 5](31/preview_5.png) | ![preview 6](31/preview_6.png) | ![preview 7](31/preview_7.png) | ![preview 8](31/preview_8.png) | | 32 | 37 | [Download](32/dataset.zip) | ![preview 1](32/preview_1.png) | 
![preview 2](32/preview_2.png) | ![preview 3](32/preview_3.png) | ![preview 4](32/preview_4.png) | ![preview 5](32/preview_5.png) | ![preview 6](32/preview_6.png) | ![preview 7](32/preview_7.png) | ![preview 8](32/preview_8.png) | | 33 | 10 | [Download](33/dataset.zip) | ![preview 1](33/preview_1.png) | ![preview 2](33/preview_2.png) | ![preview 3](33/preview_3.png) | ![preview 4](33/preview_4.png) | ![preview 5](33/preview_5.png) | ![preview 6](33/preview_6.png) | ![preview 7](33/preview_7.png) | ![preview 8](33/preview_8.png) | | 34 | 20 | [Download](34/dataset.zip) | ![preview 1](34/preview_1.png) | ![preview 2](34/preview_2.png) | ![preview 3](34/preview_3.png) | ![preview 4](34/preview_4.png) | ![preview 5](34/preview_5.png) | ![preview 6](34/preview_6.png) | ![preview 7](34/preview_7.png) | ![preview 8](34/preview_8.png) | | 35 | 13 | [Download](35/dataset.zip) | ![preview 1](35/preview_1.png) | ![preview 2](35/preview_2.png) | ![preview 3](35/preview_3.png) | ![preview 4](35/preview_4.png) | ![preview 5](35/preview_5.png) | ![preview 6](35/preview_6.png) | ![preview 7](35/preview_7.png) | ![preview 8](35/preview_8.png) | | 36 | 19 | [Download](36/dataset.zip) | ![preview 1](36/preview_1.png) | ![preview 2](36/preview_2.png) | ![preview 3](36/preview_3.png) | ![preview 4](36/preview_4.png) | ![preview 5](36/preview_5.png) | ![preview 6](36/preview_6.png) | ![preview 7](36/preview_7.png) | ![preview 8](36/preview_8.png) | | 37 | 56 | [Download](37/dataset.zip) | ![preview 1](37/preview_1.png) | ![preview 2](37/preview_2.png) | ![preview 3](37/preview_3.png) | ![preview 4](37/preview_4.png) | ![preview 5](37/preview_5.png) | ![preview 6](37/preview_6.png) | ![preview 7](37/preview_7.png) | ![preview 8](37/preview_8.png) | | 38 | 9 | [Download](38/dataset.zip) | ![preview 1](38/preview_1.png) | ![preview 2](38/preview_2.png) | ![preview 3](38/preview_3.png) | ![preview 4](38/preview_4.png) | ![preview 5](38/preview_5.png) | ![preview 6](38/preview_6.png) | ![preview 7](38/preview_7.png) | ![preview 8](38/preview_8.png) | | 39 | 104 | [Download](39/dataset.zip) | ![preview 1](39/preview_1.png) | ![preview 2](39/preview_2.png) | ![preview 3](39/preview_3.png) | ![preview 4](39/preview_4.png) | ![preview 5](39/preview_5.png) | ![preview 6](39/preview_6.png) | ![preview 7](39/preview_7.png) | ![preview 8](39/preview_8.png) | | 40 | 10 | [Download](40/dataset.zip) | ![preview 1](40/preview_1.png) | ![preview 2](40/preview_2.png) | ![preview 3](40/preview_3.png) | ![preview 4](40/preview_4.png) | ![preview 5](40/preview_5.png) | ![preview 6](40/preview_6.png) | ![preview 7](40/preview_7.png) | ![preview 8](40/preview_8.png) | | 41 | 41 | [Download](41/dataset.zip) | ![preview 1](41/preview_1.png) | ![preview 2](41/preview_2.png) | ![preview 3](41/preview_3.png) | ![preview 4](41/preview_4.png) | ![preview 5](41/preview_5.png) | ![preview 6](41/preview_6.png) | ![preview 7](41/preview_7.png) | ![preview 8](41/preview_8.png) | | 42 | 9 | [Download](42/dataset.zip) | ![preview 1](42/preview_1.png) | ![preview 2](42/preview_2.png) | ![preview 3](42/preview_3.png) | ![preview 4](42/preview_4.png) | ![preview 5](42/preview_5.png) | ![preview 6](42/preview_6.png) | ![preview 7](42/preview_7.png) | ![preview 8](42/preview_8.png) | | 43 | 5 | [Download](43/dataset.zip) | ![preview 1](43/preview_1.png) | ![preview 2](43/preview_2.png) | ![preview 3](43/preview_3.png) | ![preview 4](43/preview_4.png) | ![preview 5](43/preview_5.png) | N/A | N/A | N/A | | 44 | 7 | [Download](44/dataset.zip) | ![preview 
1](44/preview_1.png) | ![preview 2](44/preview_2.png) | ![preview 3](44/preview_3.png) | ![preview 4](44/preview_4.png) | ![preview 5](44/preview_5.png) | ![preview 6](44/preview_6.png) | ![preview 7](44/preview_7.png) | N/A | | 45 | 12 | [Download](45/dataset.zip) | ![preview 1](45/preview_1.png) | ![preview 2](45/preview_2.png) | ![preview 3](45/preview_3.png) | ![preview 4](45/preview_4.png) | ![preview 5](45/preview_5.png) | ![preview 6](45/preview_6.png) | ![preview 7](45/preview_7.png) | ![preview 8](45/preview_8.png) | | noise | 140 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
BangumiBase/kaichouwamaidsama
[ "size_categories:1K<n<10K", "license:mit", "art", "region:us" ]
2023-12-04T05:17:21+00:00
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
2023-12-04T07:11:45+00:00
[]
[]
TAGS #size_categories-1K<n<10K #license-mit #art #region-us
Bangumi Image Base of Kaichou Wa Maid-sama! =========================================== This is the image base of bangumi Kaichou wa Maid-sama!; we detected 47 characters and 3607 images in total. The full dataset is here. Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability). Here is the characters' preview:
[]
[ "TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
[ 25 ]
[ "passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n" ]
547412eecd50d7f87c8357b2de4d4b2987215217
# Dataset Card for "sni-10k-flat" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sordonia/sni-10k-flat
[ "region:us" ]
2023-12-04T05:32:08+00:00
{"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 715867937, "num_examples": 549506}], "download_size": 204866337, "dataset_size": 715867937}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T13:49:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sni-10k-flat" More Information needed
[ "# Dataset Card for \"sni-10k-flat\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sni-10k-flat\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"sni-10k-flat\"\n\nMore Information needed" ]
ba4815d6a62f9607c02b6b016b3eebfd47ae5aeb
# OpenPlatypus-Commercial version This dataset is focused on improving LLM logical reasoning skills and was used to train the Platypus2 models. It is comprised of the following datasets, which were filtered using keyword search and then Sentence Transformers to remove questions with a similarity above 80%: | Dataset Name | License Type | |--------------------------------------------------------------|--------------| | [PRM800K](https://github.com/openai/prm800k) | MIT | | [SciBench](https://github.com/mandyyyyii/scibench) | MIT | | [TheoremQA](https://huggingface.co/datasets/wenhu/TheoremQA) | MIT | | [`nuprl/leetcode-solutions-python-testgen-gpt4`](https://huggingface.co/datasets/nuprl/leetcode-solutions-python-testgen-gpt4/viewer/nuprl--leetcode-solutions-python-testgen-gpt4/train?p=1) | None listed | | [`jondurbin/airoboros-gpt4-1.4.1`](https://huggingface.co/datasets/jondurbin/airoboros-gpt4-1.4.1) | other | | [`TigerResearch/tigerbot-kaggle-leetcodesolutions-en-2k`](https://huggingface.co/datasets/TigerResearch/tigerbot-kaggle-leetcodesolutions-en-2k/viewer/TigerResearch--tigerbot-kaggle-leetcodesolutions-en-2k/train?p=2) | apache-2.0 | | [openbookQA](https://huggingface.co/datasets/openbookqa/viewer/additional/train?row=35) | apache-2.0 | | [ARB](https://arb.duckai.org) | MIT | | [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) | apache-2.0 | --- > Original model dataset ## Data Contamination Check We've removed approximately 200 questions that appear in the Hugging Face benchmark test sets. Please see our [paper](https://arxiv.org/abs/2308.07317) and [project webpage](https://platypus-llm.github.io) for additional information. ## Model Info Please see models at [`garage-bAInd`](https://huggingface.co/garage-bAInd). ## Training and filtering code Please see the [Platypus GitHub repo](https://github.com/arielnlee/Platypus). ## Citations ```bibtex @article{platypus2023, title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs}, author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz}, booktitle={arXiv preprint arxiv:2308.07317}, year={2023} } ``` ```bibtex @article{lightman2023lets, title={Let's Verify Step by Step}, author={Lightman, Hunter and Kosaraju, Vineet and Burda, Yura and Edwards, Harri and Baker, Bowen and Lee, Teddy and Leike, Jan and Schulman, John and Sutskever, Ilya and Cobbe, Karl}, journal={preprint arXiv:2305.20050}, year={2023} } ``` ```bibtex @inproceedings{lu2022learn, title={Learn to Explain: Multimodal Reasoning via Thought Chains for Science Question Answering}, author={Lu, Pan and Mishra, Swaroop and Xia, Tony and Qiu, Liang and Chang, Kai-Wei and Zhu, Song-Chun and Tafjord, Oyvind and Clark, Peter and Ashwin Kalyan}, booktitle={The 36th Conference on Neural Information Processing Systems (NeurIPS)}, year={2022} } ``` ```bibtex @misc{wang2023scibench, title={SciBench: Evaluating College-Level Scientific Problem-Solving Abilities of Large Language Models}, author={Xiaoxuan Wang and Ziniu Hu and Pan Lu and Yanqiao Zhu and Jieyu Zhang and Satyen Subramaniam and Arjun R. 
Loomba and Shichang Zhang and Yizhou Sun and Wei Wang}, year={2023}, eprint={2307.10635}, archivePrefix={arXiv} } ``` ```bibtex @inproceedings{yu2020reclor, author = {Yu, Weihao and Jiang, Zihang and Dong, Yanfei and Feng, Jiashi}, title = {ReClor: A Reading Comprehension Dataset Requiring Logical Reasoning}, booktitle = {International Conference on Learning Representations (ICLR)}, month = {April}, year = {2020} } ``` ```bibtex @article{chen2023theoremqa, title={TheoremQA: A Theorem-driven Question Answering dataset}, author={Chen, Wenhu and Ming Yin, Max Ku, Elaine Wan, Xueguang Ma, Jianyu Xu, Tony Xia, Xinyi Wang, Pan Lu}, journal={preprint arXiv:2305.12524}, year={2023} } ``` ```bibtex @inproceedings{OpenBookQA2018, title={Can a Suit of Armor Conduct Electricity? A New Dataset for Open Book Question Answering}, author={Todor Mihaylov and Peter Clark and Tushar Khot and Ashish Sabharwal}, booktitle={EMNLP}, year={2018} } ``` ```bibtex @misc{sawada2023arb, title={ARB: Advanced Reasoning Benchmark for Large Language Models}, author={Tomohiro Sawada and Daniel Paleka and Alexander Havrilla and Pranav Tadepalli and Paula Vidas and Alexander Kranias and John J. Nay and Kshitij Gupta and Aran Komatsuzaki}, eprint={2307.13692}, archivePrefix={arXiv}, year={2023} } ```
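For readers curious how the similarity filter mentioned at the top of this card might look in practice, here is a minimal, hypothetical sketch using the sentence-transformers library (the embedding model, the greedy strategy, and the interpretation of the 80% threshold are assumptions, not the authors' actual pipeline):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model, any sentence encoder would do

# Candidate questions remaining after the keyword-based filtering step.
questions = [
    "What is the derivative of x**2?",
    "Differentiate x squared.",
    "State Bayes' theorem.",
]
embeddings = model.encode(questions, convert_to_tensor=True, normalize_embeddings=True)

kept = []
for i in range(len(questions)):
    # Greedily keep a question only if it is at most 80% similar to every question kept so far.
    if all(util.cos_sim(embeddings[i], embeddings[j]).item() <= 0.80 for j in kept):
        kept.append(i)

deduplicated = [questions[i] for i in kept]
print(deduplicated)
```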
kyujinpy/Open-platypus-Commercial
[ "license:mit", "arxiv:2308.07317", "region:us" ]
2023-12-04T05:35:35+00:00
{"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "data_source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25149248, "num_examples": 19079}], "download_size": 12772956, "dataset_size": 25149248}}
2023-12-04T05:52:34+00:00
[ "2308.07317" ]
[]
TAGS #license-mit #arxiv-2308.07317 #region-us
OpenPlatypus-Commercial version =============================== This dataset is focused on improving LLM logical reasoning skills and was used to train the Platypus2 models. It comprises the following datasets, which were filtered using keyword search and then Sentence Transformers to remove questions with a similarity above 80%: --- > Original model dataset Data Contamination Check ------------------------ We've removed approximately 200 questions that appear in the Hugging Face benchmark test sets. Please see our paper and project webpage for additional information. Model Info ---------- Please see models at 'garage-bAInd'. Training and filtering code --------------------------- Please see the Platypus GitHub repo.
[]
[ "TAGS\n#license-mit #arxiv-2308.07317 #region-us \n" ]
[ 20 ]
[ "passage: TAGS\n#license-mit #arxiv-2308.07317 #region-us \n" ]
e7cdb94d0c9483a32a7815baff60677df471684a
# Dataset Card for "single_med_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
austin/single_med_data
[ "region:us" ]
2023-12-04T05:45:03+00:00
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10510513.084627181, "num_examples": 20450}, {"name": "eval", "num_bytes": 214835.91537281964, "num_examples": 418}], "download_size": 2108422, "dataset_size": 10725349.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}]}]}
2023-12-04T05:49:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for "single_med_data" More Information needed
[ "# Dataset Card for \"single_med_data\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"single_med_data\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"single_med_data\"\n\nMore Information needed" ]
a5187dbc91fab195e8212e1c3876fe1aff2645b4
# Dataset Card for "PubMedQA-ShareGPT" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Medilora/PubMedQA-ShareGPT
[ "region:us" ]
2023-12-04T06:05:27+00:00
{"dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 381454267, "num_examples": 211269}], "download_size": 200512585, "dataset_size": 381454267}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T06:08:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "PubMedQA-ShareGPT" More Information needed
[ "# Dataset Card for \"PubMedQA-ShareGPT\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"PubMedQA-ShareGPT\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"PubMedQA-ShareGPT\"\n\nMore Information needed" ]
4f7f03b22434389243f4584345f1c79d7eb27093
Description - The Asgaard and Bingsu datasets are collections of PNG image files derived from various popular video games. - These datasets are intended for use in computer vision research, particularly in areas related to gaming environments, character recognition, and scene understanding. Asgaard Dataset - Size: 10,000 PNG image files - Number of Images per Game: 1,000 - Games Included: - Witcher 3 - Ghost of Tsushima - Cyberpunk 2077 - Elden Ring - GTA - Just Cause 3 - Resident Evil 4 - The Last of Us - Skyrim - Call of Duty Bingsu Dataset - Size: 10,000 PNG image files - Number of Images per Game: 1,000 - Games Included: - Among Us - Apex Legends - Fortnite - Forza Horizon - Free Fire - Genshin Impact - God of War - Minecraft - Roblox - Terraria Dataset Structure - Data Format: PNG images File Structure: - Each game has its own directory named after the game. - Each directory contains 1,000 PNG images related to the game.
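Assuming the per-game directory layout described above is available locally, the folders can typically be read with the generic `imagefolder` builder from the Hugging Face `datasets` library, which assigns each image a label derived from its sub-directory name. The local paths below are assumptions based on the description, not paths shipped with this repository.

```python
# Sketch: load the per-game folders with the generic "imagefolder" builder.
# "Asgaard/" and "Bingsu/" are assumed local directories matching the layout above.
from datasets import load_dataset

asgaard = load_dataset("imagefolder", data_dir="Asgaard", split="train")
bingsu = load_dataset("imagefolder", data_dir="Bingsu", split="train")

print(asgaard.features["label"].names)   # one class per game directory
print(asgaard[0]["image"].size)          # PIL image decoded per example
```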
successtiger/gclip
[ "region:us" ]
2023-12-04T06:11:35+00:00
{}
2023-12-04T06:40:27+00:00
[]
[]
TAGS #region-us
Description - The Asgaard and Bingsu datasets are collections of PNG image files derived from various popular video games. - These datasets are intended for use in computer vision research, particularly in areas related to gaming environments, character recognition, and scene understanding. Asgaard Dataset - Size: 10,000 PNG image files - Number of Images per Game: 1,000 - Games Included: - Witcher 3 - Ghost of Tsushima - Cyberpunk 2077 - Elden Ring - GTA - Just Cause 3 - Resident Evil 4 - The Last of Us - Skyrim - Call of Duty Bingsu Dataset - Size: 10,000 PNG image files - Number of Images per Game: 1,000 - Games Included: - Among Us - Apex Legends - Fortnite - Forza Horizon - Free Fire - Genshin Impact - God of War - Minecraft - Roblox - Terraria Dataset Structure - Data Format: PNG images File Structure: - Each game has its own directory named after the game. - Each directory contains 1,000 PNG images related to the game.
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
4b5bbf5fb08dd7dd296e506ab572bff0e966c75d
<div align="center">
<img src="Yi_logo.svg" width="150px" style="display: inline-block;">
<img src="m-a-p.png" width="150px" style="display: inline-block;">
</div>

# Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment

## Table of Contents
- [Overview](#overview)
- [Dataset Description](#dataset-description)
- [Usage](#usage)
- [Citation](#citation)
- [Acknowledgments](#acknowledgments)

## Overview
The COIG-Kun dataset, part of the [COIG-Kun GitHub](https://github.com/Zheng0428/COIG-Kun) project, consists of instructional data used for training language models. This dataset was developed following the methodology inspired by Meta's "Self-Alignment with Instruction Backtranslation" and adapted for optimal performance in training label, point, and answer models.

## Dataset Description

### Language
- The dataset contains instructions primarily in Chinese.

### Dataset Structure
- **Data Instances**: Each data instance is structured in a JSON format with two fields: `instruction` and `output`.
  - Example: `{"instruction": "如何评价祁又一自编自导的电影《鸽子小姐》?", "output": "《鸽子小姐》是一部由祁又一自编自导的电影。..."}`
- **Data Split**: The dataset comprises three subsets:
  - `wudao.jsonl`: 139,852 instances
  - `wanjuan.jsonl`: 328,294 instances
  - `skypile.jsonl`: 71,560 instances

### Data Characteristics
- The dataset is designed to provide high-quality instructional data for language model training, focusing on enhancing the quality and applicability of the data.

## Methodology

Our approach closely follows the self-alignment method described by Meta, with adaptations to optimize the process:

1. **Seed Data Selection and Model Training**: Initially, appropriate seed data are selected and inverted to train a Label Model on a base model (Yi Base). Concurrently, using the same seed data, a Primary Chat model is trained following the Supervised Fine-Tuning (SFT) method typical of chat models.

2. **Labeling Unlabeled Data**: The Label Model is then used to annotate preliminarily cleansed Primary data. Cleansing involves filtering based on perplexity (ppl) and length, discarding data exceeding 512 tokens.

3. **Instruction Data Generation**: Post-annotation, we obtain our first version of Labeled data. Unlike the original project, where both instruction and output data pairs are fed into the Primary Chat Model for scoring, our replication revealed limitations in Primary Chat's ability to discern high-quality instructions. We innovated by scoring only the instruction component, effectively filtering out noise and selecting high-quality instructions.

4. **Output Data Refinement**: Upon manual inspection, we identified a mismatch between the Primary Data (used as output) and the standard requirements for output in instruction data. To address this, we introduced an additional step: refining the output data. Using Primary Chat's capabilities, the output (originally unlabeled data) is adjusted according to the instructions, making it more suitable as output for the instruction data.

5. **Framework Completion**: Our methodology concludes with the acquisition of a substantial volume of instructional data, achieved with minimal resource expenditure.

![Project Framework](Kun_white.Jpeg)

## Usage

### Using the Data
- The dataset can be used for training and fine-tuning language models, specifically focusing on instruction understanding and response generation; a minimal loading sketch is shown below.
- Users are encouraged to refer to the project documentation for detailed instructions on utilizing the dataset in the training process.

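As a starting point, the three subsets listed under Dataset Structure can be read with the generic JSON-lines loader, assuming the files have been downloaded locally under the names given above. This is a sketch, not part of the official project code.

```python
# Sketch: load the COIG-Kun subsets with the generic JSON-lines loader.
from datasets import load_dataset

kun = load_dataset(
    "json",
    data_files={
        "wudao": "wudao.jsonl",
        "wanjuan": "wanjuan.jsonl",
        "skypile": "skypile.jsonl",
    },
)

sample = kun["wudao"][0]
print(sample["instruction"])   # generated instruction
print(sample["output"])        # refined output text
```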
## Citation If you use this dataset in your research, please cite it as follows: ```bibtex @misc{COIG-Kun, title={Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment}, author={Tianyu, Zheng* and Shuyue, Guo* and Xingwei, Qu and Xinrun, Du and Wenhu, Chen and Jie, Fu and Wenhao, Huang and Ge, Zhang}, year={2023}, publisher={GitHub}, journal={GitHub repository}, howpublished={https://github.com/Zheng0428/COIG-Kun} } ``` ## Acknowledgments This dataset was created by a dedicated team at [M-A-P]. We acknowledge the contributions of all individuals and organizations that made this project possible.
m-a-p/COIG-Kun
[ "task_categories:question-answering", "size_categories:100K<n<1M", "language:zh", "region:us" ]
2023-12-04T06:36:50+00:00
{"language": ["zh"], "size_categories": ["100K<n<1M"], "task_categories": ["question-answering"]}
2023-12-27T07:14:20+00:00
[]
[ "zh" ]
TAGS #task_categories-question-answering #size_categories-100K<n<1M #language-Chinese #region-us
<div align="center"> <img src="Yi_logo.svg" width="150px" style="display: inline-block;"> <img src="URL" width="150px" style="display: inline-block;"> </div> # Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment ## Table of Contents - Overview - Dataset Description - Usage - Citation - Acknowledgments ## Overview The COIG-Kun dataset, part of the COIG-Kun GitHub project, consists of instructional data used for training language models. This dataset was developed following the methodology inspired by Meta's "Self-Alignment with Instruction Backtranslation" and adapted for optimal performance in training label, point, and answer models. ## Dataset Description ### Language - The dataset contains instructions primarily in Chinese. ### Dataset Structure - Data Instances: Each data instance is structured in a JSON format with two fields: 'instruction' and 'output'. - Example: '{"instruction": "如何评价祁又一自编自导的电影《鸽子小姐》?", "output": "《鸽子小姐》是一部由祁又一自编自导的电影。..."}' - Data Split: The dataset is comprised of three subsets: - 'URL': 139,852 instances - 'URL': 328,294 instances - 'URL': 71,560 instances ### Data Characteristics - The dataset is designed to provide high-quality instructional data for language model training, focusing on enhancing the quality and applicability of the data. ## Methodology Our approach closely follows the self-alignment method ådescribed by Meta, with adaptations to optimize the process: 1. Seed Data Selection and Model Training: Initially, appropriate seed data are selected and inverted to train a Label Model on a base model(Yi Base). Concurrently, using the same seed data, a Primary Chat model is trained following the Supervised Fine-Tuning (SFT) method typical of chat models. 3. Labeling Unlabeled Data: The Label Model is then used to annotate preliminarily cleansed Primary data. Cleansing involves filtering based on perplexity (ppl) and length, discarding data exceeding 512 tokens. 4. Instruction Data Generation: Post-annotation, we obtain our first version of Labeled data. Unlike the original project where both instruction and output data pairs are fed into Primary Chat Model for scoring, our replication revealed limitations in Primary Chat's ability to discern high-quality instructions. We innovated by scoring only the instruction component, effectively filtering out noise and selecting high-quality instructions. 5. Output Data Refinement: Upon manual inspection, we identified a mismatch between the Primary Data (used as output) and the standard requirements for output in instruction data. To address this, we introduced an additional step: refining the output data. Using Primary Chat's capabilities, the output (originally unlabeled data) is adjusted according to the instructions, making it more suitable as output for the instruction data. 6. Framework Completion: Our methodology concludes with the acquisition of a substantial volume of instructional data, achieved with minimal resource expenditure. !Project Framework ## Usage ### Using the Data - The dataset can be used for training and fine-tuning language models, specifically focusing on instruction understanding and response generation. - Users are encouraged to refer to the project documentation for detailed instructions on utilizing the dataset in the training process. If you use this dataset in your research, please cite it as follows: ## Acknowledgments This dataset was created by a dedicated team at [M-A-P]. 
We acknowledge the contributions of all individuals and organizations that made this project possible.
[ "# Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment", "## Table of Contents\n- Overview\n- Dataset Description\n- Usage\n- Citation\n- Acknowledgments", "## Overview\nThe COIG-Kun dataset, part of the COIG-Kun GitHub project, consists of instructional data used for training language models. This dataset was developed following the methodology inspired by Meta's \"Self-Alignment with Instruction Backtranslation\" and adapted for optimal performance in training label, point, and answer models.", "## Dataset Description", "### Language\n- The dataset contains instructions primarily in Chinese.", "### Dataset Structure\n- Data Instances: Each data instance is structured in a JSON format with two fields: 'instruction' and 'output'. \n - Example: '{\"instruction\": \"如何评价祁又一自编自导的电影《鸽子小姐》?\", \"output\": \"《鸽子小姐》是一部由祁又一自编自导的电影。...\"}'\n- Data Split: The dataset is comprised of three subsets:\n - 'URL': 139,852 instances\n - 'URL': 328,294 instances\n - 'URL': 71,560 instances", "### Data Characteristics\n- The dataset is designed to provide high-quality instructional data for language model training, focusing on enhancing the quality and applicability of the data.", "## Methodology\n\nOur approach closely follows the self-alignment method ådescribed by Meta, with adaptations to optimize the process:\n\n1. Seed Data Selection and Model Training: Initially, appropriate seed data are selected and inverted to train a Label Model on a base model(Yi Base). Concurrently, using the same seed data, a Primary Chat model is trained following the Supervised Fine-Tuning (SFT) method typical of chat models.\n\n3. Labeling Unlabeled Data: The Label Model is then used to annotate preliminarily cleansed Primary data. Cleansing involves filtering based on perplexity (ppl) and length, discarding data exceeding 512 tokens.\n\n4. Instruction Data Generation: Post-annotation, we obtain our first version of Labeled data. Unlike the original project where both instruction and output data pairs are fed into Primary Chat Model for scoring, our replication revealed limitations in Primary Chat's ability to discern high-quality instructions. We innovated by scoring only the instruction component, effectively filtering out noise and selecting high-quality instructions.\n\n5. Output Data Refinement: Upon manual inspection, we identified a mismatch between the Primary Data (used as output) and the standard requirements for output in instruction data. To address this, we introduced an additional step: refining the output data. Using Primary Chat's capabilities, the output (originally unlabeled data) is adjusted according to the instructions, making it more suitable as output for the instruction data.\n\n6. Framework Completion: Our methodology concludes with the acquisition of a substantial volume of instructional data, achieved with minimal resource expenditure.\n\n\n!Project Framework", "## Usage", "### Using the Data\n- The dataset can be used for training and fine-tuning language models, specifically focusing on instruction understanding and response generation.\n- Users are encouraged to refer to the project documentation for detailed instructions on utilizing the dataset in the training process.\n\nIf you use this dataset in your research, please cite it as follows:", "## Acknowledgments\nThis dataset was created by a dedicated team at [M-A-P]. We acknowledge the contributions of all individuals and organizations that made this project possible." ]
[ "TAGS\n#task_categories-question-answering #size_categories-100K<n<1M #language-Chinese #region-us \n", "# Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment", "## Table of Contents\n- Overview\n- Dataset Description\n- Usage\n- Citation\n- Acknowledgments", "## Overview\nThe COIG-Kun dataset, part of the COIG-Kun GitHub project, consists of instructional data used for training language models. This dataset was developed following the methodology inspired by Meta's \"Self-Alignment with Instruction Backtranslation\" and adapted for optimal performance in training label, point, and answer models.", "## Dataset Description", "### Language\n- The dataset contains instructions primarily in Chinese.", "### Dataset Structure\n- Data Instances: Each data instance is structured in a JSON format with two fields: 'instruction' and 'output'. \n - Example: '{\"instruction\": \"如何评价祁又一自编自导的电影《鸽子小姐》?\", \"output\": \"《鸽子小姐》是一部由祁又一自编自导的电影。...\"}'\n- Data Split: The dataset is comprised of three subsets:\n - 'URL': 139,852 instances\n - 'URL': 328,294 instances\n - 'URL': 71,560 instances", "### Data Characteristics\n- The dataset is designed to provide high-quality instructional data for language model training, focusing on enhancing the quality and applicability of the data.", "## Methodology\n\nOur approach closely follows the self-alignment method ådescribed by Meta, with adaptations to optimize the process:\n\n1. Seed Data Selection and Model Training: Initially, appropriate seed data are selected and inverted to train a Label Model on a base model(Yi Base). Concurrently, using the same seed data, a Primary Chat model is trained following the Supervised Fine-Tuning (SFT) method typical of chat models.\n\n3. Labeling Unlabeled Data: The Label Model is then used to annotate preliminarily cleansed Primary data. Cleansing involves filtering based on perplexity (ppl) and length, discarding data exceeding 512 tokens.\n\n4. Instruction Data Generation: Post-annotation, we obtain our first version of Labeled data. Unlike the original project where both instruction and output data pairs are fed into Primary Chat Model for scoring, our replication revealed limitations in Primary Chat's ability to discern high-quality instructions. We innovated by scoring only the instruction component, effectively filtering out noise and selecting high-quality instructions.\n\n5. Output Data Refinement: Upon manual inspection, we identified a mismatch between the Primary Data (used as output) and the standard requirements for output in instruction data. To address this, we introduced an additional step: refining the output data. Using Primary Chat's capabilities, the output (originally unlabeled data) is adjusted according to the instructions, making it more suitable as output for the instruction data.\n\n6. Framework Completion: Our methodology concludes with the acquisition of a substantial volume of instructional data, achieved with minimal resource expenditure.\n\n\n!Project Framework", "## Usage", "### Using the Data\n- The dataset can be used for training and fine-tuning language models, specifically focusing on instruction understanding and response generation.\n- Users are encouraged to refer to the project documentation for detailed instructions on utilizing the dataset in the training process.\n\nIf you use this dataset in your research, please cite it as follows:", "## Acknowledgments\nThis dataset was created by a dedicated team at [M-A-P]. 
We acknowledge the contributions of all individuals and organizations that made this project possible." ]
[ 35, 27, 24, 81, 4, 15, 142, 42, 388, 3, 77, 41 ]
[ "passage: TAGS\n#task_categories-question-answering #size_categories-100K<n<1M #language-Chinese #region-us \n# Kun: Answer Polishment Saves Your Time for Using Intruction Backtranslation on Self-Alignment## Table of Contents\n- Overview\n- Dataset Description\n- Usage\n- Citation\n- Acknowledgments## Overview\nThe COIG-Kun dataset, part of the COIG-Kun GitHub project, consists of instructional data used for training language models. This dataset was developed following the methodology inspired by Meta's \"Self-Alignment with Instruction Backtranslation\" and adapted for optimal performance in training label, point, and answer models.## Dataset Description### Language\n- The dataset contains instructions primarily in Chinese.### Dataset Structure\n- Data Instances: Each data instance is structured in a JSON format with two fields: 'instruction' and 'output'. \n - Example: '{\"instruction\": \"如何评价祁又一自编自导的电影《鸽子小姐》?\", \"output\": \"《鸽子小姐》是一部由祁又一自编自导的电影。...\"}'\n- Data Split: The dataset is comprised of three subsets:\n - 'URL': 139,852 instances\n - 'URL': 328,294 instances\n - 'URL': 71,560 instances### Data Characteristics\n- The dataset is designed to provide high-quality instructional data for language model training, focusing on enhancing the quality and applicability of the data." ]
c2c27d66cf8bb9f6c3848dcc56afe890590fc548
<div align="center"> <img src="Yi_logo.svg" width="150px" style="display: inline-block;"> <img src="siat-logo.jpg" width="150px" style="display: inline-block;"> <img src="m-a-p.png" width="150px" style="display: inline-block;"> </div> # COIG-CQIA:Quality is All you need for Chinese Instruction Fine-tuning <!-- Provide a quick summary of the dataset. --> ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> 欢迎来到COIG-CQIA,COIG-CQIA全称为**Chinese Open Instruction Generalist - Quality is All You Need**, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供**高质量**且符合**人类交互行为**的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。 Welcome to the COIG-CQIA project page. COIG-CQIA stands for **Chinese Open Instruction Generalist - Quality is All You Need**, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with **high-quality** and **human interaction-aligned** instruction fine-tuning data.Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance. - **Curated by:** 来自零一万物、中科院深圳先进技术研究院,和M-A-P等机构的研究者们。 - **Language(s) (NLP):** 本数据集主要语言为中文。 - **License:** [More Information Needed] 本数据集当前为v0.1版本,如果您在使用中发现数据集存在问题或者有可以改进的地方,欢迎留言交流! ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> 本数据集适用于指令微调,训练模型具备响应指令的能力。 ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. 
--> [More Information Needed] ## 数据 ### 数据格式 ```json { "instruction": "示例问题或者指令。", "input": "示例问题或指令的补充。", "output": "对输入的回复。", "task_type": { "major": ["问答"], "minor": ["百科问答"] }, "domain": ["百科", "医疗"], "answer_from": "human", "human_verified": true, "copyright": "作者及版权信息。", } ``` ### 数据字段 - `instruction`: 用于输入的指令或者问题。 - `input`: 问题或指令的补充内容。 - `output`: 输入对应的回答。 - `task_type`: 表示该数据所属的主要任务类型和细分任务类型。 - `domain`: 该数据所属领域。 - `answer_from`: 回答是人类撰写的还是大模型撰写的,本数据集中绝大部分是由人类撰写的回答,少部分由大模型生成(经过了人工验证)。 - `human_verified`: 该数据是否又人类核验过。 - `copyright`: 包括该数据的版权信息,包括作者等。 当前版本的数据字段中仍有不完善的部分,我们将在近期的下一版本中补充。 ### 数据详情 <details> <summary><b>社交媒体&论坛</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 知乎 | 8837 | [[网址链接]](https://www.zhihu.com/) | 经过多阶段的数据质量筛选和人工验证。 | | 豆瓣 | 3132 | [[网址链接]](https://www.douban.com/) | 人工撰写多样的prompt模板构造而成。 | | 小红书 | 1508 | [[网址链接]](https://www.xiaohongshu.com/explore) | 人工撰写多样的prompt模板构造而成。 | | Segmentfault | 458 | [[网址链接]](https://segmentfault.com/) | 规则方式清洗与筛选,并经过人工验证。 | | **总量** | **13935** | - | - | </details> <details> <summary><b>通用百科</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 百科文章 | 980 | 从网络中收集。[[网址链接]](https://10why.net/) [[网址链接]](https://www.eetree.cn/wiki/eebaike) [[网址链接]](https://www.nongyie.com/) [[网址链接]](https://www.gkket.com/gkwk/) | 规则方式清洗与筛选,并经过人工验证。 | | 中国大百科全书 | 1706 | [[网址链接]](https://www.zgbk.com/) | 人工撰写多样的prompt模板构造而成。 | | wikiHow中文 | 1876 | [[网址链接]](https://zh.wikihow.com/首页)&[[公开数据集]](https://github.com/esbatmop/MNBVC/tree/main) | 规则方式清洗与筛选。 | | **总量** | **4571** | - | - | </details> </details> <details> <summary><b>通用NLP任务</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | COIG-PC-Core | 3000 | [[Open Dataset]](https://huggingface.co/datasets/BAAI/COIG-PC-core) | 人工验证数据质量。 | | **总量** | **3000** | - | - | </details> <details> <summary><b>考试&试题</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 高考&中考 | 2000 | [[公开数据集]](https://huggingface.co/datasets/BAAI/COIG) | - | | 研究生入学考试 | 475 | 从网络中收集 | 规则方式清洗与筛选。 | | 逻辑推理题 | 422 | 从网络中收集 | 规则方式清洗与筛选。 | | **总量** | **2897** | - | - | </details> <details> <summary><b>人类价值观</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 100poison | 906 | [[公开数据集]](https://modelscope.cn/datasets/damo/100PoisonMpts/summary) | - | | COIG-human-value | 101 | [[公开数据集]](https://huggingface.co/datasets/BAAI/COIG) | 经人工审核数据质量 | | **总量** | **1007** | - | - | </details> <details> <summary><b>中国传统文化</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 中华传统文化试题 | 232 | 从网络中收集 | 规则方式清洗与筛选,并经过人工验证。 | | 成语释义 | 112 | [[公开数据集]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | 规则方式清洗与筛选,并经过人工验证。 | | 古诗词撰写 | 47 | [[公开数据集]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | 规则方式清洗与筛选,并经过人工验证。 | | 文言文互译 | 112 | [[公开数据集]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | 规则方式清洗与筛选,并经过人工验证。 | | **总量** | **503** | - | - | </details> <details> <summary><b>金融&经管领域</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | MBA百科 | 10689 | [[网址链接]](https://wiki.mbalib.com/wiki/首页) | 人工撰写多样的prompt模板构造而成。 | | 金融NLP任务 | 600 
| [[公开数据集]](https://huggingface.co/datasets/BAAI/COIG-PC) | 人工核验数据质量 | | **总量** | **11289** | - | - | </details> <details> <summary><b>医疗领域</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 医疗百科 | 8351 | [[网址链接]](www.baikemy.com) | 人工撰写多样的prompt模板构造而成。 | | 医疗文章 | 186 | [[网址链接]](https://51zyzy.com/article/list.html) [[网址链接]](https://baobao.baidu.com/dailyjnl/list/13.html) | 规则方式清洗与筛选。 | | **总量** | **8537** | - | - | </details> <details> <summary><b>法律领域</b></summary> | 类别 | 数量 | 来源 | 构造方式 | | ----------------- | -------- | ------ | --------------------------------------- | | 法律研究生入学考试 | 2645 | 从网络中收集 | 规则方式清洗与筛选。 | | **总量** | **2645** | - | - | </details> ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> 如果本项目为您的研究带来了帮助,欢迎引用! ```bibtex @misc{COIG-CQIA, author = {}, title = {COIG-CQIA: Quality is All you need for Chinese Instruction Fine-tuning}, year = {2023}, publisher = {GitHub}, journal = {GitHub repository}, howpublished = {\url{https://github.com/paralym/COIG-CQIA}}, } ``` 本数据集中也包含了以下公开数据: ```bibtex @article{zhang2023chinese, title={Chinese open instruction generalist: A preliminary release}, author={Zhang, Ge and Shi, Yemin and Liu, Ruibo and Yuan, Ruibin and Li, Yizhi and Dong, Siwei and Shu, Yu and Li, Zhaoqun and Wang, Zekun and Lin, Chenghua and others}, journal={arXiv preprint arXiv:2304.07987}, year={2023} } @misc{Firefly, author = {Jianxin Yang}, title = {Firefly(流萤): 中文对话式大语言模型}, year = {2023}, publisher = {GitHub}, journal = {GitHub repository}, howpublished = {\url{https://github.com/yangjianxin1/Firefly}}, } @misc{xu2023cvalues, title={CValues: Measuring the Values of Chinese Large Language Models from Safety to Responsibility}, author={Guohai Xu and Jiayi Liu and Ming Yan and Haotian Xu and Jinghui Si and Zhuoran Zhou and Peng Yi and Xing Gao and Jitao Sang and Rong Zhang and Ji Zhang and Chao Peng and Fei Huang and Jingren Zhou}, year={2023}, eprint={2307.09705}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
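To make the field description above concrete, the sketch below filters records by `domain` and `human_verified`. It assumes the records are available locally as JSON lines (the file name `cqia.jsonl` is a placeholder); loading the Hub repository directly may require selecting a subset and is not shown here.

```python
# Sketch: filter COIG-CQIA-style records by domain and human verification.
# "cqia.jsonl" is a placeholder for a local JSON-lines export of the records.
from datasets import load_dataset

cqia = load_dataset("json", data_files="cqia.jsonl", split="train")

# Keep human-verified examples whose domain list contains "医疗" (medical).
medical = cqia.filter(lambda ex: "医疗" in ex["domain"] and ex["human_verified"])
print(len(medical))
print(medical[0]["instruction"], "->", medical[0]["output"][:50])
```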
m-a-p/COIG-CQIA
[ "task_categories:question-answering", "task_categories:text-classification", "task_categories:text-generation", "task_categories:conversational", "task_categories:text2text-generation", "size_categories:10K<n<100K", "language:zh", "arxiv:2307.09705", "region:us" ]
2023-12-04T07:04:37+00:00
{"language": ["zh"], "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "text-classification", "text-generation", "conversational", "text2text-generation"]}
2024-01-29T17:07:42+00:00
[ "2307.09705" ]
[ "zh" ]
TAGS #task_categories-question-answering #task_categories-text-classification #task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #arxiv-2307.09705 #region-us
![](Yi_logo.svg) ![](URL) ![](URL) COIG-CQIA:Quality is All you need for Chinese Instruction Fine-tuning ===================================================================== Dataset Details --------------- ### Dataset Description 欢迎来到COIG-CQIA,COIG-CQIA全称为Chinese Open Instruction Generalist - Quality is All You Need, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供高质量且符合人类交互行为的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。 Welcome to the COIG-CQIA project page. COIG-CQIA stands for Chinese Open Instruction Generalist - Quality is All You Need, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with high-quality and human interaction-aligned instruction fine-tuning data.Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance. * Curated by: 来自零一万物、中科院深圳先进技术研究院,和M-A-P等机构的研究者们。 * Language(s) (NLP): 本数据集主要语言为中文。 * License: 本数据集当前为v0.1版本,如果您在使用中发现数据集存在问题或者有可以改进的地方,欢迎留言交流! Uses ---- ### Direct Use 本数据集适用于指令微调,训练模型具备响应指令的能力。 ### Out-of-Scope Use 数据 -- ### 数据格式 ### 数据字段 * 'instruction': 用于输入的指令或者问题。 * 'input': 问题或指令的补充内容。 * 'output': 输入对应的回答。 * 'task\_type': 表示该数据所属的主要任务类型和细分任务类型。 * 'domain': 该数据所属领域。 * 'answer\_from': 回答是人类撰写的还是大模型撰写的,本数据集中绝大部分是由人类撰写的回答,少部分由大模型生成(经过了人工验证)。 * 'human\_verified': 该数据是否又人类核验过。 * 'copyright': 包括该数据的版权信息,包括作者等。 当前版本的数据字段中仍有不完善的部分,我们将在近期的下一版本中补充。 ### 数据详情 **社交媒体&论坛** **通用百科** **通用NLP任务** **考试&试题** **人类价值观** **中国传统文化** **金融&经管领域** **医疗领域** **法律领域** ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. 如果本项目为您的研究带来了帮助,欢迎引用! 本数据集中也包含了以下公开数据:
[ "### Dataset Description\n\n\n欢迎来到COIG-CQIA,COIG-CQIA全称为Chinese Open Instruction Generalist - Quality is All You Need, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供高质量且符合人类交互行为的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。\n\n\nWelcome to the COIG-CQIA project page. COIG-CQIA stands for Chinese Open Instruction Generalist - Quality is All You Need, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with high-quality and human interaction-aligned instruction fine-tuning data.Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance.\n\n\n* Curated by: 来自零一万物、中科院深圳先进技术研究院,和M-A-P等机构的研究者们。\n* Language(s) (NLP): 本数据集主要语言为中文。\n* License:\n\n\n本数据集当前为v0.1版本,如果您在使用中发现数据集存在问题或者有可以改进的地方,欢迎留言交流!\n\n\nUses\n----", "### Direct Use\n\n\n本数据集适用于指令微调,训练模型具备响应指令的能力。", "### Out-of-Scope Use\n\n\n数据\n--", "### 数据格式", "### 数据字段\n\n\n* 'instruction': 用于输入的指令或者问题。\n* 'input': 问题或指令的补充内容。\n* 'output': 输入对应的回答。\n* 'task\\_type': 表示该数据所属的主要任务类型和细分任务类型。\n* 'domain': 该数据所属领域。\n* 'answer\\_from': 回答是人类撰写的还是大模型撰写的,本数据集中绝大部分是由人类撰写的回答,少部分由大模型生成(经过了人工验证)。\n* 'human\\_verified': 该数据是否又人类核验过。\n* 'copyright': 包括该数据的版权信息,包括作者等。\n\n\n当前版本的数据字段中仍有不完善的部分,我们将在近期的下一版本中补充。", "### 数据详情\n\n\n\n**社交媒体&论坛**\n\n\n\n**通用百科**\n\n\n\n\n**通用NLP任务**\n\n\n\n**考试&试题**\n\n\n\n**人类价值观**\n\n\n\n**中国传统文化**\n\n\n\n**金融&经管领域**\n\n\n\n**医疗领域**\n\n\n\n**法律领域**", "### Recommendations\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n\n如果本项目为您的研究带来了帮助,欢迎引用!\n\n\n本数据集中也包含了以下公开数据:" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-classification #task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #arxiv-2307.09705 #region-us \n", "### Dataset Description\n\n\n欢迎来到COIG-CQIA,COIG-CQIA全称为Chinese Open Instruction Generalist - Quality is All You Need, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供高质量且符合人类交互行为的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。\n\n\nWelcome to the COIG-CQIA project page. COIG-CQIA stands for Chinese Open Instruction Generalist - Quality is All You Need, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with high-quality and human interaction-aligned instruction fine-tuning data.Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance.\n\n\n* Curated by: 来自零一万物、中科院深圳先进技术研究院,和M-A-P等机构的研究者们。\n* Language(s) (NLP): 本数据集主要语言为中文。\n* License:\n\n\n本数据集当前为v0.1版本,如果您在使用中发现数据集存在问题或者有可以改进的地方,欢迎留言交流!\n\n\nUses\n----", "### Direct Use\n\n\n本数据集适用于指令微调,训练模型具备响应指令的能力。", "### Out-of-Scope Use\n\n\n数据\n--", "### 数据格式", "### 数据字段\n\n\n* 'instruction': 用于输入的指令或者问题。\n* 'input': 问题或指令的补充内容。\n* 'output': 输入对应的回答。\n* 'task\\_type': 表示该数据所属的主要任务类型和细分任务类型。\n* 'domain': 该数据所属领域。\n* 'answer\\_from': 回答是人类撰写的还是大模型撰写的,本数据集中绝大部分是由人类撰写的回答,少部分由大模型生成(经过了人工验证)。\n* 'human\\_verified': 该数据是否又人类核验过。\n* 'copyright': 包括该数据的版权信息,包括作者等。\n\n\n当前版本的数据字段中仍有不完善的部分,我们将在近期的下一版本中补充。", "### 数据详情\n\n\n\n**社交媒体&论坛**\n\n\n\n**通用百科**\n\n\n\n\n**通用NLP任务**\n\n\n\n**考试&试题**\n\n\n\n**人类价值观**\n\n\n\n**中国传统文化**\n\n\n\n**金融&经管领域**\n\n\n\n**医疗领域**\n\n\n\n**法律领域**", "### Recommendations\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n\n如果本项目为您的研究带来了帮助,欢迎引用!\n\n\n本数据集中也包含了以下公开数据:" ]
[ 88, 399, 19, 12, 5, 184, 50, 56 ]
[ "passage: TAGS\n#task_categories-question-answering #task_categories-text-classification #task_categories-text-generation #task_categories-conversational #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #arxiv-2307.09705 #region-us \n### Dataset Description\n\n\n欢迎来到COIG-CQIA,COIG-CQIA全称为Chinese Open Instruction Generalist - Quality is All You Need, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供高质量且符合人类交互行为的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。\n\n\nWelcome to the COIG-CQIA project page. COIG-CQIA stands for Chinese Open Instruction Generalist - Quality is All You Need, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with high-quality and human interaction-aligned instruction fine-tuning data.Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance.\n\n\n* Curated by: 来自零一万物、中科院深圳先进技术研究院,和M-A-P等机构的研究者们。\n* Language(s) (NLP): 本数据集主要语言为中文。\n* License:\n\n\n本数据集当前为v0.1版本,如果您在使用中发现数据集存在问题或者有可以改进的地方,欢迎留言交流!\n\n\nUses\n----### Direct Use\n\n\n本数据集适用于指令微调,训练模型具备响应指令的能力。" ]
f1fb238f308b78ebe8581b0cd378533bd86fb174
# Dataset Card for "dpo_openhermes" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ThWu/dpo_openhermes
[ "region:us" ]
2023-12-04T07:32:39+00:00
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 557125321, "num_examples": 182859}], "download_size": 289321706, "dataset_size": 557125321}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T07:37:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dpo_openhermes" More Information needed
[ "# Dataset Card for \"dpo_openhermes\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dpo_openhermes\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"dpo_openhermes\"\n\nMore Information needed" ]
3d0b7e9a8a1013fcdc36b13215a8f2b390fe3e0d
# Dataset Card for "mnli-binary"

This dataset is the [mnli-3way](https://huggingface.co/datasets/AntoineBlanot/mnli-3way) dataset where the `contradiction` and `neutral` classes have been merged into a single `non-entailment` class.

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
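The relabeling described above can be reproduced from the original `multi_nli` dataset roughly as follows. This is a sketch of the idea rather than the exact script used to build this repository; it assumes the standard `multi_nli` label ids (0 = entailment, 1 = neutral, 2 = contradiction).

```python
# Sketch: collapse neutral/contradiction into a single "non-entailment" class.
from datasets import load_dataset

mnli = load_dataset("multi_nli", split="train")

def to_binary(example):
    # label id 0 is entailment in multi_nli; everything else becomes non-entailment
    return {"label_name": "entailment" if example["label"] == 0 else "non-entailment"}

keep = ("premise", "hypothesis")
binary = mnli.map(to_binary, remove_columns=[c for c in mnli.column_names if c not in keep])
print(binary[0])
```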
AntoineBlanot/mnli-binary
[ "region:us" ]
2023-12-04T07:51:57+00:00
{"dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 76452262, "num_examples": 392702}, {"name": "test", "num_bytes": 1878757, "num_examples": 9815}], "download_size": 51169514, "dataset_size": 78331019}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-04T08:06:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mnli-binary"

This dataset is the mnli-3way dataset where the 'contradiction' and 'neutral' classes have been merged into a single 'non-entailment' class.

More Information needed
[ "# Dataset Card for \"mnli-binary\"\nThis dataset is the mnli-3way dataset where the 'contradiction' and 'neutral' classes has been merged together as a 'non-entailment' class.\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mnli-binary\"\nThis dataset is the mnli-3way dataset where the 'contradiction' and 'neutral' classes has been merged together as a 'non-entailment' class.\n\nMore Information needed" ]
[ 6, 55 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mnli-binary\"\nThis dataset is the mnli-3way dataset where the 'contradiction' and 'neutral' classes has been merged together as a 'non-entailment' class.\n\nMore Information needed" ]
c4ee82946034405d7bc81632cae9fe3950f17cf4
# Dataset Card for "mnli-contrast"

This dataset is the [mnli-3way](https://huggingface.co/datasets/AntoineBlanot/mnli-3way) dataset with an additional `instruction` feature.
This new feature, along with its related `label_name`, expresses how the `premise` and `hypothesis` features are related in the original dataset.

The following explains how the mapping is done:

### If the original example was of class `entailment`
Two data points will be related to that example.

One is the positive example (i.e., `label_name` == "positive"), which assigns to it the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."
The other is the negative example (i.e., `label_name` == "negative"), which assigns to it the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."

### If the original example was of class `contradiction` or `neutral`
Two data points will be related to that example.

One is the positive example (i.e., `label_name` == "positive"), which assigns to it the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."
The other is the negative example (i.e., `label_name` == "negative"), which assigns to it the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."

This dataset is double the size of the original dataset because each example is paired with both a positive and a negative instruction.

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
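The mapping described above can be written as a small transformation over the 3-way examples, each of which yields one positive and one negative row. The snippet below is a sketch using the two instruction strings quoted in this card, not the exact build script.

```python
# Sketch of the contrastive expansion: one positive and one negative row per 3-way example.
ENTAILMENT_INSTRUCTION = (
    "The meaning of the hypothesis is logically inferred from the meaning of the premise."
)
NON_ENTAILMENT_INSTRUCTION = (
    "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated "
    "to it, or does not provide sufficient information to infer the meaning of the premise."
)

def expand(example):
    true_instr = ENTAILMENT_INSTRUCTION if example["label_name"] == "entailment" else NON_ENTAILMENT_INSTRUCTION
    false_instr = NON_ENTAILMENT_INSTRUCTION if example["label_name"] == "entailment" else ENTAILMENT_INSTRUCTION
    base = {"premise": example["premise"], "hypothesis": example["hypothesis"]}
    return [
        {**base, "instruction": true_instr, "label_name": "positive"},
        {**base, "instruction": false_instr, "label_name": "negative"},
    ]

rows = expand({"premise": "A dog runs.", "hypothesis": "An animal is moving.", "label_name": "entailment"})
print(rows[0]["label_name"], rows[1]["label_name"])  # positive negative
```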
AntoineBlanot/mnli-contrast
[ "region:us" ]
2023-12-04T07:52:32+00:00
{"dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "label_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 254483428, "num_examples": 785404}, {"name": "test", "num_bytes": 6297986, "num_examples": 19630}], "download_size": 54354034, "dataset_size": 260781414}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-04T08:06:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mnli-contrast"

This dataset is the mnli-3way dataset with an additional 'instruction' feature.
This new feature, along with its related 'label_name', expresses how the 'premise' and 'hypothesis' features are related in the original dataset.

The following explains how the mapping is done:

### If the original example was of class 'entailment'
Two data points will be related to that example.

One is the positive example (i.e., 'label_name' == "positive"), which assigns to it the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."
The other is the negative example (i.e., 'label_name' == "negative"), which assigns to it the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."

### If the original example was of class 'contradiction' or 'neutral'
Two data points will be related to that example.

One is the positive example (i.e., 'label_name' == "positive"), which assigns to it the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."
The other is the negative example (i.e., 'label_name' == "negative"), which assigns to it the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."

This dataset is double the size of the original dataset because each example is paired with both a positive and a negative instruction.

More Information needed
[ "# Dataset Card for \"mnli-contrast\"\nThis dataset is the mnli-3way dataset with an additional 'instruction' feature.\nThis new feature along with its related 'label_name' expresses how the 'premise' and 'hypothesis' features are related in the original dataset.\n\nThe following explains how the mapping is done:", "### If the original example was of class 'entailment'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"", "### If the original example was of class 'contradiction' or 'neutral'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\n\nThis dataset is double the size of this original dataset because each is related to a positive and negative instruction.\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mnli-contrast\"\nThis dataset is the mnli-3way dataset with an additional 'instruction' feature.\nThis new feature along with its related 'label_name' expresses how the 'premise' and 'hypothesis' features are related in the original dataset.\n\nThe following explains how the mapping is done:", "### If the original example was of class 'entailment'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"", "### If the original example was of class 'contradiction' or 'neutral'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\n\nThis dataset is double the size of this original dataset because each is related to a positive and negative instruction.\n\nMore Information needed" ]
[ 6, 79, 151, 182 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mnli-contrast\"\nThis dataset is the mnli-3way dataset with an additional 'instruction' feature.\nThis new feature along with its related 'label_name' expresses how the 'premise' and 'hypothesis' features are related in the original dataset.\n\nThe following explains how the mapping is done:### If the original example was of class 'entailment'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"### If the original example was of class 'contradiction' or 'neutral'\nTwo data points will be related to that example.\n\nOne is the positive example (i.e., 'label_name' == \"positive\") which assign to it the folowing instruction: \"The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise.\"\nThe other is the negative example (i.e., 'label_name' == \"negative\") which assign to it the folowing instruction: \"The meaning of the hypothesis is logically inferred from the meaning of the premise.\"\n\nThis dataset is double the size of this original dataset because each is related to a positive and negative instruction.\n\nMore Information needed" ]
3d7bc15da8c17e617df3df2c63772906934b37c4
# Dataset Card for "Thai-Pos-prompt"

This dataset was made to create prompts for Thai part-of-speech tagging.

- The train set was made from the [blackboard treebank](https://bitbucket.org/kaamanita/blackboard-treebank), which uses its own tag set; we convert those tags into Universal Dependencies tags. [See blackboard treebank tags to Universal Dependencies tags](https://github.com/PyThaiNLP/pythainlp/blob/dev/pythainlp/tag/blackboard.py#L56C5-L56C17)
- The test set comes from the Parallel Universal Dependencies (PUD) treebank. See more at [https://github.com/UniversalDependencies/UD_Thai-PUD](https://github.com/UniversalDependencies/UD_Thai-PUD)

## Template

```
Inputs: จงสร้างประโยคตามโครงสร้าง {pos}:
Targets: Thai sentence
```

pos: [any Universal Dependencies POS tag](https://universaldependencies.org/u/pos/)

Source code for creating the dataset: [https://github.com/PyThaiNLP/support-aya-datasets/blob/main/pos/merge.ipynb](https://github.com/PyThaiNLP/support-aya-datasets/blob/main/pos/merge.ipynb)
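The template above can be filled from any POS-tagged sentence. The function below is only a sketch of how an (inputs, targets) pair might be assembled from a list of (word, UD tag) pairs; the exact construction used for this dataset is in the linked notebook.

```python
# Sketch: build one prompt pair from a POS-tagged Thai sentence.
def make_prompt(tagged_tokens):
    # tagged_tokens: list of (word, UD tag) pairs
    pos_sequence = " ".join(tag for _, tag in tagged_tokens)
    inputs = f"จงสร้างประโยคตามโครงสร้าง {pos_sequence}:"
    targets = "".join(word for word, _ in tagged_tokens)  # Thai is written without spaces
    return {"inputs": inputs, "targets": targets}

print(make_prompt([("ฉัน", "PRON"), ("กิน", "VERB"), ("ข้าว", "NOUN")]))
```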
pythainlp/Thai-Pos-prompt
[ "task_categories:text2text-generation", "task_categories:text-generation", "size_categories:10K<n<100K", "language:th", "license:cc-by-sa-3.0", "region:us" ]
2023-12-04T07:53:11+00:00
{"language": ["th"], "license": "cc-by-sa-3.0", "size_categories": ["10K<n<100K"], "task_categories": ["text2text-generation", "text-generation"], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 539162, "num_examples": 1000}, {"name": "train", "num_bytes": 29246726, "num_examples": 130454}], "download_size": 6418943, "dataset_size": 29785888}}
2023-12-04T07:59:35+00:00
[]
[ "th" ]
TAGS #task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us
# Dataset Card for "Thai-Pos-prompt"

This dataset was made to create prompts for Thai part-of-speech tagging.

- The train set was made from the blackboard treebank, which uses its own tag set; we convert those tags into Universal Dependencies tags. See blackboard treebank tags to Universal Dependencies tags
- The test set comes from the Parallel Universal Dependencies (PUD) treebank. See more URL

## Template

pos: any Universal Dependencies POS tag

Source code for creating the dataset: URL
[ "# Dataset Card for \"Thai-Pos-prompt\"\nThis dataset made to create prompt for Thai part-of-speech tagging.\n\n- The Train set made from blackboard treebank that used own tags but we use Universal Dependencies tags, so we convert those tags into Universal Dependencies tags. See blackboard treebank tags to Universal Dependencies tags\n- The test set from the Parallel Universal Dependencies (PUD) treebanks. See more URL", "## Template\n\npos: All tag\n\n\nSource code for create dataset: URL" ]
[ "TAGS\n#task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us \n", "# Dataset Card for \"Thai-Pos-prompt\"\nThis dataset made to create prompt for Thai part-of-speech tagging.\n\n- The Train set made from blackboard treebank that used own tags but we use Universal Dependencies tags, so we convert those tags into Universal Dependencies tags. See blackboard treebank tags to Universal Dependencies tags\n- The test set from the Parallel Universal Dependencies (PUD) treebanks. See more URL", "## Template\n\npos: All tag\n\n\nSource code for create dataset: URL" ]
[ 58, 103, 14 ]
[ "passage: TAGS\n#task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us \n# Dataset Card for \"Thai-Pos-prompt\"\nThis dataset made to create prompt for Thai part-of-speech tagging.\n\n- The Train set made from blackboard treebank that used own tags but we use Universal Dependencies tags, so we convert those tags into Universal Dependencies tags. See blackboard treebank tags to Universal Dependencies tags\n- The test set from the Parallel Universal Dependencies (PUD) treebanks. See more URL## Template\n\npos: All tag\n\n\nSource code for create dataset: URL" ]
40a9a20245304307a4100400bedac59b70cf7c6b
configs:
- config_name: default
  data_files:
  - split: train
    path: train/neg/*, train/pos/*
  - split: test
    path: test/neg/*, test/pos/*
dataset_info:
  features:
  - name: text
    dtype: string
  - name: label
    dtype: integer (1 for positive, -1 for negative)
  splits:
  - name: train
    num_examples: 20027
  - name: test
    num_examples: 2223
  download_size: 79.1MB
  dataset_size: 773.4MB

# Dataset Card for "DBRD: Dutch Book Reviews Dataset"

Translation of the [Dutch Book Review Dataset (DBRD)](https://github.com/benjaminvdb/DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the [Large Movie Review Dataset](http://ai.stanford.edu/~amaas/data/sentiment/).

The dataset and the scripts used for scraping the reviews from Hebban, a Dutch platform for book enthusiasts, can be found in the [DBRD GitHub repository](https://github.com/benjaminvdb/DBRD).

# Labels

Distribution of labels positive/negative/neutral in rounded percentages.

```
training:  50/50/ 0
test:      50/50/ 0
```

# Attribution

Please use the following citation when making use of this dataset in your work:

```citation
@article{DBLP:journals/corr/abs-1910-00896,
  author    = {Benjamin van der Burgh and Suzan Verberne},
  title     = {The merits of Universal Language Model Fine-tuning for Small Datasets - a case with Dutch book reviews},
  journal   = {CoRR},
  volume    = {abs/1910.00896},
  year      = {2019},
  url       = {http://arxiv.org/abs/1910.00896},
  archivePrefix = {arXiv},
  eprint    = {1910.00896},
  timestamp = {Fri, 04 Oct 2019 12:28:06 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-1910-00896.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

# Acknowledgements (as per GIT repository)

This dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure is based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset.

And of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthusiasts and NLP researchers very happy :)

---
license: mit
---
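As a quick sanity check of the splits and label balance described above, the dataset can be loaded from this repository. This is a sketch: it assumes the default configuration resolves to the `text` and `label` fields listed in the header, which should be verified against the files.

```python
# Sketch: load the DBRD splits from this repository and check the label balance.
# Assumes the default config exposes "text" and "label" as described above.
from collections import Counter
from datasets import load_dataset

dbrd = load_dataset("ebrigham/DBRD")

print(dbrd)                              # expected: train and test splits
print(Counter(dbrd["train"]["label"]))   # roughly 50/50 positive (1) / negative (-1)
```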
ebrigham/DBRD
[ "task_categories:text-classification", "language:nl", "license:mit", "arxiv:1910.00896", "region:us" ]
2023-12-04T07:57:07+00:00
{"language": ["nl"], "license": "mit", "task_categories": ["text-classification"], "pretty_name": "DBRD"}
2023-12-04T08:31:13+00:00
[ "1910.00896" ]
[ "nl" ]
TAGS #task_categories-text-classification #language-Dutch #license-mit #arxiv-1910.00896 #region-us
configs: - config_name: default data_files: - split: train path: train/neg/*, train/pos/* - split: test path: test/neg/*, test/pos/* dataset_info: features: - name: text dtype: string - name: label dtype: integer (1 for positive, -1 for negative) splits: - name: train num_examples: 20027 - name: test num_examples: 2223 download_size: 79.1MB dataset_size: 773,4MB # Dataset Card for "DBRD: Dutch Book Reviews Dataset" Translation of the Dutch Book Review Dataset (DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the Large Movie Review Dataset. The dataset and the scripts used for scraping the reviews from Hebban, a Dutch platform for book enthusiasts, can be found in the DBRD GitHub repository. # Labels Distribution of labels positive/negative/neutral in rounded percentages. # Attribution Please use the following citation when making use of this dataset in your work: # Acknowledgements (as per GIT repository) This dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset. And of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthousiast and NLP researchers very happy :) --- license: mit ---
[ "# Dataset Card for \"DBRD: Dutch Book Reviews Dataset\"\n\n\nTranslation of the Dutch Book Review Dataset (DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the Large Movie Review Dataset.\n\nThe dataset and the scripts used for scraping the reviews from Hebban, a Dutch platform for book enthusiasts, can be found in the DBRD GitHub repository.", "# Labels\n\nDistribution of labels positive/negative/neutral in rounded percentages.", "# Attribution\n\nPlease use the following citation when making use of this dataset in your work:", "# Acknowledgements (as per GIT repository)\n\nThis dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset.\n\nAnd of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthousiast and NLP researchers very happy :)\n\n---\nlicense: mit\n---" ]
[ "TAGS\n#task_categories-text-classification #language-Dutch #license-mit #arxiv-1910.00896 #region-us \n", "# Dataset Card for \"DBRD: Dutch Book Reviews Dataset\"\n\n\nTranslation of the Dutch Book Review Dataset (DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the Large Movie Review Dataset.\n\nThe dataset and the scripts used for scraping the reviews from Hebban, a Dutch platform for book enthusiasts, can be found in the DBRD GitHub repository.", "# Labels\n\nDistribution of labels positive/negative/neutral in rounded percentages.", "# Attribution\n\nPlease use the following citation when making use of this dataset in your work:", "# Acknowledgements (as per GIT repository)\n\nThis dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset.\n\nAnd of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthousiast and NLP researchers very happy :)\n\n---\nlicense: mit\n---" ]
[ 37, 114, 19, 19, 179 ]
[ "passage: TAGS\n#task_categories-text-classification #language-Dutch #license-mit #arxiv-1910.00896 #region-us \n# Dataset Card for \"DBRD: Dutch Book Reviews Dataset\"\n\n\nTranslation of the Dutch Book Review Dataset (DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the Large Movie Review Dataset.\n\nThe dataset and the scripts used for scraping the reviews from Hebban, a Dutch platform for book enthusiasts, can be found in the DBRD GitHub repository.# Labels\n\nDistribution of labels positive/negative/neutral in rounded percentages.# Attribution\n\nPlease use the following citation when making use of this dataset in your work:# Acknowledgements (as per GIT repository)\n\nThis dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset.\n\nAnd of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthousiast and NLP researchers very happy :)\n\n---\nlicense: mit\n---" ]
3306ed6d1c7985e9581b0d9e2dfb5f82adf5d5fd
# Dataset Card for "mnli-3way" This dataset is the [multi_nli](https://huggingface.co/datasets/multi_nli) dataset where the labels are: `entailment`, `contradiction` and `neutral`. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AntoineBlanot/mnli-3way
[ "region:us" ]
2023-12-04T08:01:57+00:00
{"dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "label_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 75405059, "num_examples": 392702}, {"name": "test", "num_bytes": 1853683, "num_examples": 9815}], "download_size": 51216284, "dataset_size": 77258742}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-04T08:02:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mnli-3way" This dataset is the multi_nli dataset where the labels are: 'entailment', 'contradiction' and 'neutral'. More Information needed
[ "# Dataset Card for \"mnli-3way\"\nThis dataset is the multi_nli dataset where the labels are: 'entailment', 'contradiction' and 'neutral'.\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mnli-3way\"\nThis dataset is the multi_nli dataset where the labels are: 'entailment', 'contradiction' and 'neutral'.\nMore Information needed" ]
[ 6, 47 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mnli-3way\"\nThis dataset is the multi_nli dataset where the labels are: 'entailment', 'contradiction' and 'neutral'.\nMore Information needed" ]
27bb1fd9ffe94caa0625426a22a1e863fa72ef23
## Spam Detection

Spam emails, spam text messages, and advertisement recognition.

### Data Sources

The datasets were collected from the web and organized as follows:

| Data | Language | Task Type | Original data / project URL | Sample counts | Original data description | Alternative download links |
| :--- | :---: | :---: | :---: | :---: | :---: | :---: |
| enron_spam | English | spam email classification | [enron_spam_data](https://github.com/MWiechmann/enron_spam_data); [Enron-Spam](https://www2.aueb.gr/users/ion/data/enron-spam/); [spam-mails-dataset](https://www.kaggle.com/datasets/venky73/spam-mails-dataset) | ham: 16545; spam: 17171 | The Enron-Spam dataset is an excellent resource collected by V. Metsis, I. Androutsopoulos and G. Paliouras | [SetFit/enron_spam](https://huggingface.co/datasets/SetFit/enron_spam); [enron-spam](https://www.kaggle.com/datasets/wanderfj/enron-spam) |
| enron_spam_subset | English | spam email classification | [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset) | ham: 5000; spam: 5000 | | |
| ling_spam | English | spam email classification | [lingspam-dataset](https://www.kaggle.com/datasets/mandygu/lingspam-dataset); [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset) | ham: 2172; spam: 433 | The Ling-Spam dataset is a collection of 2,893 spam and non-spam messages compiled from the Linguist List. | |
| sms_spam | English | spam SMS classification | [SMS Spam Collection](https://archive.ics.uci.edu/dataset/228/sms+spam+collection); [SMS Spam Collection Dataset](https://www.kaggle.com/datasets/uciml/sms-spam-collection-dataset) | ham: 4827; spam: 747 | The SMS Spam Collection is a public set of labeled SMS messages collected for mobile phone spam research. | [sms_spam](https://huggingface.co/datasets/sms_spam) |
| sms_spam_collection | English | spam SMS classification | [spam-emails](https://www.kaggle.com/datasets/abdallahwagih/spam-emails) | ham: 4825; spam: 747 | This dataset contains a collection of emails. | [email-spam-detection-dataset-classification](https://www.kaggle.com/datasets/shantanudhakadd/email-spam-detection-dataset-classification); [spam-identification](https://www.kaggle.com/datasets/amirdhavarshinis/spam-identification); [sms-spam-collection](https://www.kaggle.com/datasets/thedevastator/sms-spam-collection-a-more-diverse-dataset); [spam-or-ham](https://www.kaggle.com/datasets/arunasivapragasam/spam-or-ham) |
| spam_assassin | English | spam email classification | [datasets-spam-assassin](https://github.com/stdlib-js/datasets-spam-assassin); [Apache SpamAssassin’s public datasets](https://spamassassin.apache.org/old/publiccorpus/); [Spam or Not Spam Dataset](https://www.kaggle.com/datasets/ozlerhakan/spam-or-not-spam-dataset) | ham: 4150; spam: 1896 | This dataset comes from the completeSpamAssassin.csv file of [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset). | [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset); [talby/SpamAssassin](https://huggingface.co/datasets/talby/spamassassin); [spamassassin-2002](https://www.kaggle.com/datasets/cesaber/spam-email-data-spamassassin-2002) |
| spam_base | English | spam email classification | [spambase](https://archive.ics.uci.edu/dataset/94/spambase) | | Classify emails as spam or non-spam. | [spam-email-data-uci](https://www.kaggle.com/datasets/kaggleprollc/spam-email-data-uci) |
| spam_detection | English | spam SMS classification | [Deysi/spam-detection-dataset](https://huggingface.co/datasets/Deysi/spam-detection-dataset) | ham: 5400; spam: 5500 | | |
| spam_message | Chinese | spam SMS classification | [SpamMessage](https://github.com/hrwhisper/SpamMessage) | ham: 720000; spam: 80000 | The spam samples are genuine but have been anonymized (e.g. "admissions hotline: xxxxxxxxxxx"), so the x characters may become a salient feature. The ham samples look like fragments truncated from ordinary text, so using this data is not recommended. | |
| spam_message_lr | Chinese | spam SMS classification | [SpamMessagesLR](https://github.com/x-hacker/SpamMessagesLR) | ham: 3983; spam: 6990 | | |
| trec07p | English | spam email classification | [2007 TREC Public Spam Corpus](https://plg.uwaterloo.ca/~gvcormac/treccorpus07/); [Spam Track](https://trec.nist.gov/data/spam.html) | 
ham: 25220; spam: 50199 | 2007 TREC Public Spam Corpus | [trec07p.tar.gz](https://pan.baidu.com/s/1jC9CxVaxwizFCvGtI1JvJA?pwd=g72z) | | trec06c | 汉语 | 垃圾邮件分类 | [2006 TREC Public Spam Corpora](https://plg.uwaterloo.ca/~gvcormac/treccorpus06/); | | 2006 TREC Public Spam Corpora | | | youtube_spam_collection | 英语 | 垃圾评论分类 | [youtube+spam+collection](https://archive.ics.uci.edu/dataset/380/youtube+spam+collection); [YouTube Spam Collection Data Set](https://www.kaggle.com/datasets/lakshmi25npathi/images) | ham: 951; spam: 1005 | 它是为垃圾邮件研究而收集的公共评论集。 | | ### 样本示例 <details> <summary>enron_spam 样本示例</summary> <pre><code>------------ wanted to try ci 4 lis but thought it was way too expensive for you ? <br> viagra at $ 1 . 12 per dose ready to boost your sex life ? positive ? time to do it right now . order viagra at incredibly low prices $ 1 . 12 per dose . unbelivable remove <br> spam ------------ enron / hpl actuals for december 11 , 2000 <br> teco tap 30 . 000 / enron ; 120 . 000 / hpl gas daily ls hpl lsk ic 30 . 000 / enron ham ------------ looking for cheap high - quality software ? rotated napoleonizes <br> water past also , burn , course . gave country , mass lot . act north good . from , learn form most brother vary . when more for . up stick , century put , song be . test , describe , plain , against wood star . began dress ever group . here oh , most world stay . <br> spam ------------ ideabank website <br> please read the attached document for information about an exciting new website for ets employees ! ham ------------ </code></pre> </details> <details> <summary>enron_spam_subset 样本示例</summary> <pre><code>------------ Subject: edrugs online viagra - proven step to start something all over again . nothing is more useful than silence . teachers open the door . you enter by yourself . how sharper than a serpent ' s tooth it isto have a thankless child ! spam ------------ Subject: start date : 12 / 13 / 01 ; hourahead hour : 5 ; start date : 12 / 13 / 01 ; hourahead hour : 5 ; no ancillary schedules awarded . no variances detected . log messages : parsing file - - > > o : \ portland \ westdesk \ california scheduling \ iso final schedules \ 2001121305 . txt ham ------------ Subject: cheapestt medss ! mediccationns at lowesst pricess everyy ! over 80 . % offf , pricess wontt get lowerr we selll vic ' od ( in v , ia . gra x , ana . x http : / / www . pr 3 sdlugs . com / ? refid = 87 <br> spam ------------ Subject: fw : picture > > > the following is an aerial photo of the wtc area . it kinda brings on > vertigo , but is a phenomenal shot . > > http : / / userwww . service . emory . edu / ~ rdgarr / wtc . htm ham ------------ </code></pre> </details> <details> <summary>ling_spam 样本示例</summary> <pre><code>------------ Subject: internet specialist 007 - the spy <br> internet specialist 007 - the spy learn everything about your friends , neighbors , enemies , employees or anyone else ! - - even your boss ! - - even yourself ! this mammoth snoop collection of internet sites will provide you the newest and most current addresses available on the net today . . . = 20 * track down an old debt , or anyone else that has done you wrong ! it 's incredible , and so many new data sites have come online in the past 90 days . . . * over 300 giant resources to look up people , credit , social security , current or past employment , mail order purchases , = 20 addresses , phone numbers , maps to city locations . . . * investigate your family history ! 
check birth , death , adoption or social security records check service records or army , navy , air force or = 20 marine corps . * locate an old friend ( or an enemy who is hiding ) or a lost = 20 love - - find e-mail , telephone or address information on anyone ! = 20 even look up * unlisted * phone numbers ! * find work by searching classified ads all over the world ! * screen prospective employees - - check credit , driving or criminal records verify income or educational accomplishments = 20 * check out your daughter 's new boyfriend ! * find trial transcripts and court orders ! * enjoy the enchantment of finding out a juicy tid-bit about a co-worker . the internet is a powerful megasource of information , = 20 if you only know where to look . i tell you how to find = 20 out nearly anything about anybody , and tell you exactly where to find it ! you will be amazed to find out what personal information = 20 other people can find out about you ! check your credit = 20 report so you can correct wrong information that may be = 20 used to deny you credit . research yourself first ! you ' ll be horrified , as i was , = 20 at how much data has been accumulated about you . any my huge collection is only the beginning ! once you = 20 locate these free private , college and government web sites , you ' ll find even more links to even more = 20 information search engines ! = 20 if you believe ( like i do ) that the information that is stored about each one of us should be freely accessible , you ' ll want to see the snoop collection i ' ve compiled . verify your own records , or find out what you need to = 20 know about others . i ' m telling you , it 's incredible what you can find out using the internet ! we will accept checks by fax at 813-269 - 9651 or > > > send $ 14 . 95 cash , check or money order to : > > > the coldwell group > > > p . o . box 3787 > > > dept 1007 > > > petersburg , va 23805 i will rush back to you my snoop information for fastest service include your * e-mail * address . = 20 * what information is available - - and exact url to get there ! * exactly where to look for - - and the clever way to use - - = 20 the above search engines , and tons more ! * my easy-to - browse categorized megacenter of information has my own description of how to use each site , and what you ' ll find when you get there - - and tricky tips on how to = 20 extract the best data ! you can know everything about everybody with this internet specialist collection ! * * soon to be available - - the most complete international internet spy = 20 sites available on the web today * * don ' t miss this one or you ' ll be sorry = 20 to be removed from our list please fax your address to 813-269 - 9651 . l = e3 = 01 @ u = 0b <br> spam ------------ Subject: usage - based models - symposium <br> announcing the sixth biennial symposium of the rice university department of linguistics usage-based models of language rice university march 15-18 , 1995 invited speakers : mira ariel tel aviv university joan bybee university of new mexico john du bois university of california , santa barbara michael israel university of california , san diego sydney lamb rice university ronald langacker university of california , san diego tom givon university of oregon brian macwhinney carnegie - mellon university janet pierrehumbert northwestern university john sinclair university of birmingham ( u . k . 
) arie verhagen university of utrecht description : the goal of this symposium is to explore approaches to linguistic theory that have in common the aim of accounting for linguistic usage . the empirical data for such theories is not restricted to linguistic intuitions about acceptibility , but comes from usage events of varied types . the focus is on the patterns found in the various sorts of usage data examined , and how those patterns can be extracted , represented , and used by the human mind . research from a variety of traditions will be represented , including corpus-based analyses , discourse studies , experimental studies of language processing and language acquisition , and instrumental phonetics . the approaches taken can be called data-driven , rather than model-driven , in that the fewest possible prior assumptions are made about what types of data are relevant , and that large sets of usage events are observed so that the detailed patterns found in actual usage can emerge . moreover , the various approaches taken show signs of converging toward a view of language as a dynamic system in which linguistic knowledge is not separate from its processing in language use . the linguistic models representing this view are usage-based by virtue of three factors : ( 1 ) the importance placed on usage data for theory construction ; ( 2 ) the direct incorporation of processing ( production and comprehension ) into linguistic theory ; and ( 3 ) the requirement that the models arrived at , whatever the direct source of evidence , must be testable with reference to language use . registration : no charge . symposium attendance on a space-available basis . for further information , contact suzanne kemmer ( kemmer @ ruf . rice . edu ) or michael barlow ( barlow @ ruf . rice . edu ) snailmail : dept . of linguistics , rice university , houston tx 77251-1892 . <br> ham ------------ Subject: domani <br> new improved with free software , free bulk e mail system , free web site = to do what you wish , ongoing support ( optional ) , and a lot more ! all = included . . . . . . . . . . . this is a one time mailing . . . . . . . . . . . . . . . \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ $ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ = \ \ \ \ \ you are about to make at least $ 50 , 000 in less than 90 days read the enclosed program . . . then read it again . . . / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / = / / / / / / / dear friend , the enclosed information is something i almost let slip through my fingers . fortunately , sometime later i re-read everything and gave some thought and study to it . my name is christopher erickson . two years ago , the corporation i worked at for the past twelve years down-sized and my position was eliminated . after unproductive job interviews , i decided to open my own business . over the past year , i incurred many unforeseen financial problems . i owed my family , friends , and creditors over $ 35 , 000 . the economy was taking a toll on my business and i just could n't seem to make ends meet . i had to refinance and borrow against my home to support my family and struggling business . i truly believe it was wrong for me to be in debt like this . at that moment something significant happened in my life and i am writing to share my experience in hopes that this will change your life forever . . . . financially ! 
! ! in mid - december , i received this program via email . six months prior to receiving this program i had been sending away for information on various business opportunities . all of the programs i received , in my opinion , were not cost effective . they were either too difficult for me to comprehend or the initial investment was too much for me to risk to see if they worked or not . one claimed i 'd make a million dollars in one year . . . it did n't tell me i 'd have to write a book to make it . but like i was saying , in december of ' 92 i received this program . i did n't send for it , or ask for it , they just got my name off a mailing list . thank goodness for that ! ! ! after reading it several times , to = make sure i was reading it correctly , i could n't believe my eyes . = 20 here was a money-making phenomenon . i could invest as much as i wanted = to start , without putting me further in debt . after i got a pencil and paper and figured it out , i would at least get my money back . after determining that the program is legal and not a chain letter , i decided " why not " . initially i sent out 10 , 000 emails . it only cost me about $ 15 . 00 for my time on-line . the great thing about email is that i did n't need any money for printing to send out the program , only the cost to fulfill my orders . i am telling you like it is , i hope it does n't turn you off , but i promised myself that i would not " rip-off " anyone , no matter how much money it cost me ! . in less than one week , i was starting to receive orders for report # 1 . by january 13th , i had received 26 orders for report # 1 . when you read the guarantee in the program , you will see that " you must receive = 15 to 20 orders for report # 1 within two weeks . if you don ' t , send out = more programs until you do ! " my first step in making $ 50 , 000 in 20 to = 90 days was done . by january 30th , i had received 196 orders for report = # 2 . if you go back to the guarantee , " you must receive 100 or more orders for report # 2 within two weeks . if not , send out more = programs until you do . once you have 100 orders , the rest is easy , = relax , you will make your $ 50 , 000 goal . " well , i had 196 orders for = report # 2 , 96 more than i needed . so i sat back and relaxed . by march = 19th , of my emailing of 10 , 000 , i received $ 58 , 000 with more coming in = every day . i paid off all my debts and bought a much needed new car . please take time to read the attached program , it will change your life forever ! remember , it wont work if you do n't try it . this program does work , but you must follow it exactly ! especially the rules of not trying to place your name in a different place . it does n't work , you ' ll lose out on a lot of money ! report # 2 explains this . = 20 always follow the guarantee , 15 to 20 orders for report # 1 , and 100 or more orders for report # 2 and you will make $ 50 , 000 or more in 20 to 90 days . i am living proof that it works ! ! ! if you choose not to participate in this program , i ' m sorry . it really is a great opportunity with little cost or risk to you . if you choose to participate , follow the program and you will be on your way to financial security . if you are a fellow business owner and you are in financial trouble like i was , or you want to start your own business , consider this a sign . i did ! sincerely , christopher erickson ps do you have any idea what 11 , 700 $ 5 bills ( $ 58 , 000 ) look like piled up on a kitchen table ? it ' s awesome ! 
" threw it away " " i had received this program before . i threw it away , but later wondered if i should n't have given it a try . of course , i had no idea who to contact to get a copy , so i had to wait until i was emailed another copy of the program . eleven months passed , then it came . i didn ' t throw this one away . i made $ 41 , 000 on the first try . " dawn w . , evansville , in " no free lunch " " my late father always told me , ' remember , alan , there is no free lunch in life . you get out of life what you put into it . ' through trial and error and a somewhat slow frustrating start , i finally figured it out . the program works very well , i just had to find the right target group of people to email it to . so far this year , i have made over $ 63 , 000 using this program . i know my dad would have been very proud of me . " alan b . , philadelphia , pa a personal note from the originator of this program by the time you have read the enclosed information and looked over the enclosed program and reports , you should have concluded that such a program , and one that is legal , could not have been created by an amateur . let me tell you a little about myself . i had a profitable business for ten years . then in 1979 my business began falling off . i was doing the same things that were previously successful for me , but it was n't working . finally , i figured it out . it was n't me , it was the economy . inflation and recession had replaced the stable economy that had been with us since 1945 . i do n't have to tell you what happened to the unemployment rate . . . because many of you know from first hand experience . there were more failures and bankruptcies than ever before . the middle class was vanishing . those who knew what they were doing = invested wisely and moved up . those who did not , including those who = never had anything to save or invest , were moving down into the ranks of = the poor . as the saying goes , " the rich get richer and the poor get = poorer . " the traditional methods of making money will never allow you = to " move up " or " get rich " , inflation will see to that . you have just received information that can give you financial freedom for the rest of your life , with " no risk " and " just a little bit of effort . " you can make more money in the next few months than you have = ever imagined . i should also point out that i will not see a penny of your money , nor anyone else who has provided a testimonial for this program . i have already made over four million dollars ! i have retired from the program after sending out over 16 , 000 programs . now i have several offices which market this and several other programs here in the us and overseas . by the spring , we wish to market the ' internet ' by a partnership with america on line . follow the program exactly as instructed . do not change it in any way . = it works exceedingly well as it is now . remember to email a copy of = this exciting program to everyone that you can think of . one of the people you send this to may send out 50 , 000 . . . and your name will be on every one of them ! . remember though , the more you send out , the = more potential customers you will reach . so my friend , i have given you the ideas , information , materials and opportunity to become financially independent , it is up to you now ! " think about it " before you delete this program from your mailbox , as i almost did , take a little time to read it and really think about it . 
get a pencil and figure out what could happen when you participate . figure out the worst possible response and no matter how you calculate it , you will still make a lot of money ! definitely get back what you invested . = 20 any doubts you have will vanish when your first orders come in . it works ! paul johnson , raleigh , nc here ' s how this amazing program will make you $ $ $ $ $ $ let 's say that you decide to start small , just to see how it goes , and we ' ll assume you and all those involved send out 2 , 000 programs each . let 's also assume that the mailing receives a . 5 % response . using a good list the response could be much better . also many people will send out hundreds of thousands of programs instead of 2 , 000 . but continuing with this example , you send out only 2 , 000 programs . with a . 5 % response , that is only 10 orders for report # 1 . those 10 people respond by sending out 2 , 000 programs each for a total of 20 , 000 . out of those . 5 % , 100 people respond and order report # 2 . those 100 mail out 2 , 000 programs each for a total of 200 , 000 . the . 5 % response to that is 1 , 000 orders for report # 3 . those 1 , 000 send out 2 , 000 programs each for a 2 , 000 , 000 total . the . 5 % response to that is 10 , 000 orders for report # 4 . that 's 10 , 000 five dollar bills for you . cash ! ! ! ! your total income in this example is $ 50 + $ 500 + $ 5000 + $ 50 , 000 for a total of $ 55 , 550 ! ! ! ! remember friend , this is assuming 1 , 990 out of 2 , 000 people you mail to = will do absolutely nothing . . . and trash this program ! dare to think for = a moment what would happen if everyone or half sent out 100 , 000 programs instead of only 2 , 000 . believe me , many people will do = that and more ! by the way , your cost to participate in this is = practically nothing . you obviously already have an internet connection and email is free ! ! ! report # 3 will show you the best methods for bulk emailing and purchasing email lists . this is a legitimate , legal , money making opportunity . it does not require you to come in contact with people , do any hard work , and best of all , you never have to leave the house except to get the mail . if you believe that someday you ' ll get that big break that you ' ve been waiting for , this is it ! simply follow the instructions , and your dream will come true . this multi-level email order marketing program works perfectly . . . 100 % every time . email is the sales tool of the future . take advantage of this non-commercialized method of advertising now ! ! the longer you wait , the more people will be doing business using email . get your piece of this action ! ! multi-level marketing ( mlm ) has finally gained respectability . it is = being taught in the harvard business school , and both stanford research and the wall street journal have stated that between 50 % and = 65 % of all goods and services will be sold throughout multi - level methods by the mid to late 1990 's . this is a multi - billion dollar industry and of the 500 , 000 millionaires in the us , 20 % ( 100 , 000 ) made their fortune in the last several years in mlm . moreover , statistics show 45 people become millionaires everyday through multi - level marketing . instructions we at erris mail order marketing business , have a method of raising capital that really works 100 % every time . i am sure that you could use = $ 50 , 000 to $ 125 , 000 in the next 20 to 90 days . before you say " bull " , please read the program carefully . 
this is not a chain letter , but a perfectly legal money making opportunity . basically , this is what we do : as with all multi-level business , we build our business by recruiting new partners and selling our products . every state in the usa allows you to recruit new multi - level business partners , and we offer a product for every dollar sent . your orders come and are filled through the mail , so you are not = involved in personal selling . you do it privately in your own home , = store or office . this is the greatest multi - level mail order marketing anywhere : step ( 1 ) order all four 4 reports listed by name and number . dothis by ordering the report from each of the four 4 names listed on the next page . for each report , send $ 5 cash and a self - addressed , stamped envelope ( business size # 10 ) = to the person listed for the specific report . international = = 20 orders should also include $ 2 extra for postage . it is essential that you specify the name and number of the report requested to the person you are ordering from . you will need all four 4 reports because you will be reprinting and reselling them . do not alter the names or sequence other than what the instructions say . important : always provide same-day service on all orders . step ( 2 ) replace the name and address under report # 1 with yours , moving the one that was there down to report # 2 . drop the name and address under report # 2 to report # 3 , moving the one that was there to report # 4 . the name and address that was under report # 4 is dropped from the list and this party is no doubt on the way to the bank . when doing this , make certain you type the names and addresses accurately ! do not mix up moving product / report positions ! ! ! step ( 3 ) having made the required changes in the name list , save it as a text ( . txt ) file in it 's own directory to be used with whatever email program you like . again , report # 3 will tell you the best methods of bulk emailing and acquiring email lists . step ( 4 ) email a copy of the entire program ( all of this is very important ) to everyone whose address you can get your hands on . start with friends and relatives since you can encourage them to take advantage of this fabulous = 20 money-making opportunity . that 's what i did . and they love me now , more than ever . then , email to anyone and everyone ! use your imagination ! you can get email addresses from companies on the internet who specialize in email mailing lists . these are very cheap , 100 , 000 addresses for around $ 35 . 00 . important : you won't get a good response if you use an old list , so always request a fresh , new list . you will find out where to purchase these lists when you order the four 4 reports . always provide same-day service on all orders ! ! ! required reports * * * order each report by number and name * * * always send a self-addressed , stamped envelope and $ 5 usd cash for each order requesting the specific report by name and number ( international orders should also include $ 2 usd extra for postage ) = 20 add you e amil address when sending in for your report this is for = updated information and continueing support ( optional ) that will be = handed down by you sponcers . _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 1 " how to make $ 250 , 000 through multi-level sales " order report # 1 from : a . siegmund # 57 trakehnenstr . 
13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 2 " major corporations and multi-level sales " j . maz 15774 s . lagrange rd suite # 312 orland pk , il 60462 usa _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ order report # 2 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 3 " sources for the best mailing lists " order report # 3 from : b . thompson 13504 greencaslte ridge tr . 404 burtonsville md . 20866 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 4 " evaluating multi-level sales plans " order report # 4 from : muw # 2 po box 71442 salt lake city , ut 84171-0442 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ conclusion . i am enjoying my fortune that i made by sending out this program . you too , will be making money in 20 to 90 days , if you follow the simple steps outlined in this mailing . to be financially independent is to be free . free to make financial decisions as never before . go into business , get into investments , retire or take a vacation . = 20 = = = = = = 20 <br> spam ------------ Subject: linguistic datasources <br> at the request of subscribers , we ' ve been collecting the addresses of linguistic datasources which can be reached through world wide web . these addresses are now available to any of you who have web access on the linguist web server at the following url : http : / / engserve . tamu . edu / files / linguistics / linguist / datasources . html this file is also available , to those of you who read web - linguist , through the " linguistic datasources " link . we 'd be delighted to hear any comments anyone would care to make . and if there ' re any addresses we need to add , please let us know what they are . we 'd like to emphasize that we 'd be happy to include sites where individual linguists keep data they would like to make available to their colleagues . since the web allows us to share not merely text , but pictures and sound-recordings , we can now begin an interchange of linguistic information that is of a very different nature from that which was possible in the past . anthony & helen <br> ham ------------ </code></pre> </details> <details> <summary>sms_spam 样本示例</summary> <pre><code>------------ She's borderline but yeah whatever. ham ------------ URGENT! You have won a 1 week FREE membership in our £100,000 Prize Jackpot! Txt the word: CLAIM to No: 81010 T&C www.dbuk.net LCCLTD POBOX 4403LDNW1A7RW18 spam ------------ I got a call from a landline number. . . I am asked to come to anna nagar . . . I will go in the afternoon ham ------------ Free Msg: get Gnarls Barkleys "Crazy" ringtone TOTALLY FREE just reply GO to this message right now! spam ------------ I will be gentle princess! We will make sweet gentle love... ham ------------ Refused a loan? Secured or Unsecured? Can't get credit? Call free now 0800 195 6669 or text back 'help' & we will! spam ------------ Sorry, went to bed early, nightnight ham ------------ Free entry in 2 a wkly comp to win FA Cup final tkts 21st May 2005. 
Text FA to 87121 to receive entry question(std txt rate)T&C's apply 08452810075over18's spam ------------ HMM yeah if your not too grooved out! And im looking forward to my pound special :) ham ------------ FreeMsg Hey there darling it's been 3 week's now and no word back! I'd like some fun you up for it still? Tb ok! XxX std chgs to send, £1.50 to rcv spam ------------ Hi dude hw r u da realy mising u today ham ------------ WINNER!! As a valued network customer you have been selected to receivea £900 prize reward! To claim call 09061701461. Claim code KL341. Valid 12 hours only. spam ------------ </code></pre> </details> <details> <summary>spam_assassin 样本示例</summary> <pre><code>------------ "\n>>>Robert Elz said:\n > Date: Wed, 28 Aug 2002 09:22:34 -0500\n > From: Chris Garrigues <[email protected]\nm>\n > Message-ID: <[email protected]>\n > \n > \n > | so I'll probably poke around at the sequences performance issues,\n > \n > Well, there's this wonderful piece of code in MhSeqExpand ...\n > \n > # Hack to weed out sequence numbers for messages that don't exist\n > foreach m $rseq {\n > if ![file exists $mhProfile(path)/$folder/$m] {\n > Exmh_Debug $mhProfile(path)/$folder/$m not found\n > set ix [lsearch $seq $m]\n > set seq [lreplace $seq $ix $ix]\n > } else {\n > # Real hack\n\nAt least I'm up-front about my hacks :-)\n\n > break\n > }\n > }\n > \n > which is going to run slow if a sequence happens to start with a bunch\n > of messages that don't exist. I'm not sure why it is important that the\n > first message in the sequence returned exists, but not necessarily any\n > of the others, but I'm sure glad it is, as MhSeqExpand gets called lots,\n > and I don't know if I could cope if it were checking every file in the\n > sequences it is looking at, all the time...\n\nThat was my thinking. My recollection about the first message being valid\nis that the ftoc code wants to find that message to start its highlighting,\nfor example, or you are selecting a message to display.\n\n > It may help to keep a list of the valid message numbers for the current\n > folder (though that would then need to be verified against changes to the\n > directory). Does tcl have a directory read function? I assume so...\n\nglob -nocomplain $mhProfile(path)/$folder *\nwill return an unsorted list of the directory's contents.\nBut the thought of keeping an in memory list of valid messages is not fun.\nExmh already maintains in-core lists of messages in sequences, which is\nalready pretty tricky\n\n > Mh_Sequence also goes and rereads the files (.mh_sequences and the\n > context file) but I'm not sure how frequently that one is called.\n\nIn some places I maintain caches of files by checking their modify time,\nbut the sequence files are soo small that by the time you stat them to\ncheck their date stamp, you could just read them again. Also, now that\nwe checkpoint message state on every message view, that file will change\nevery time. In the old days exmh used to cache a bunch of state about\nthe folder.\n\n--\nBrent Welch\nSoftware Architect, Panasas Inc\nPioneering the World's Most Scalable and Agile Storage Network\nwww.panasas.com\[email protected]\n\n\n\n\n_______________________________________________\nExmh-workers mailing list\[email protected]\nhttps://listman.redhat.com/mailman/listinfo/exmh-workers\n\n" spam ------------ "Whiter teeth and a brighter smile are just a click away! \nhttp://www.newnamedns.com/dental/\n\nHave you considered professional teeth whitening? 
If so, you know it usually costs between $300 and $500 from your local dentist!\n\nVisit our site to learn how to professionally whiten your teeth, using the exact same whitening system your dentist uses, at a fraction of the cost!\n\nWe know our product is the best on the market, and we back it with a 30 day money back guarantee!\n\nClick here to find out more!\nhttp://www.newnamedns.com/dental/\n \n \n \n \n \n---------------------------------------\nTo easily remove your address from the list, go to: \nhttp://www.newnamedns.com/stopthemailplease/\nPlease allow 48-72 hours for removal.\n\n" ham ------------ "> Hi,\n> \n> On Sun, 01 Sep 2002 00:05:03 MDT Reg Clemens wrote: \n> \n> [...]\n> > in messages with GnuPG signatures. But punching the line ALWAYS\n> > gives\n> > \n> > Signature made Thu Aug 29 00:27:17 2002 MDT using DSA key ID BDDF997A\n> > Can't check signature: public key not found\n> > \n> > So, something else is missing.\n> \n> Yes, the public key of the signature you want to check :-).\n> \n> Are you really sure that you have the public key of the message's\n> signature? If not, try downloading it or try to check a signature from\n> which you know you have the public key.\n> \n> \n> \n\nAh, sorry for not making that clearer.\nBut no.\nPreviously (v1.0.6 of GnuPG) there would be a slight pause at this point while\nit went out to get the public key from a keyserver.\nNow, whether I have the key or NOT, I get the failure message.\n\nIts as if it cant find gpg to execute it (but I fixed that path), so there\nmust be something else that I am missing...\n\n\n-- \n Reg.Clemens\n [email protected]\n\n\n\n\n_______________________________________________\nExmh-users mailing list\[email protected]\nhttps://listman.redhat.com/mailman/listinfo/exmh-users\n\n" spam ------------ "Below is the result of your feedback form. It was submitted by\n ([email protected]) on Tuesday, August 27, 2002 at 22:26:56\n---------------------------------------------------------------------------\n\n:: <A HREF=\"http://www.pornhere.net\">click here</A>Don't want to pay for Porn? Would you like to get it for FREE? The Honest, No Risk 100% Free Way? If you take a couple of minutes and read through our simple guide you will be able to get Free Passes to the top Paysites online!<A HREF=\"http://www.pornhere.net\">click here</A>\n\n---------------------------------------------------------------------------\n\n\n" ham ------------ </code></pre> </details> <details> <summary>spam_detection 样本示例</summary> <pre><code>------------ Looking for an epic way to boost your followers? Want to increase your likes? Well, listen up! Our site has the ultimate solution for you! With our automated system, you can get hundreds of followers in seconds! Not only that, but our team of social media experts will also provide you with the latest and greatest tips and tricks to make your page go viral! Don't wait, join the millions of successful users who have already transformed their social media game with our help! <br> spam ------------ I'm working on a bayesian network to genetare synthetic WIM observations such as gross vehicle weight, total vehicle length, individual axle load and inter axle distances. It will be very helpful to have data sets for different sites. Thanks ham ------------ Hey, check out these amazing deals just for you! 🎁🎉 <br> 💰 Get rich quick! Click here to learn how! 💰 <br> 😍 Need a date? We've got you covered! Join our dating site now! 😍 <br> 🔥 Burn fat fast with our miracle pill! Guaranteed results! 
🔥 <br> 👀 Want to see what everyone's talking about? Click here for the latest viral videos! 👀 <br> 🤑 Don't miss out on our exclusive discount codes! Shop now and save big! 🤑 <br> 🌈 Feeling lucky? Enter our sweepstakes for spam ------------ Looking for a dataset with the latest Omicron covid variant data and confirmed locations. Any links would be appreciated, thanks! ham ------------ </code></pre> </details> <details> <summary>spam_message 样本示例</summary> <pre><code>------------ 欣彩印刷厂主营彩盒.吊牌.产品说明书及提供四色印刷对外加工.电话xxxxxxxxxxx米先生 spam ------------ 微软称Windows10是“有史以来最安全的Windows版本” ham ------------ 您好!欢迎致电川中意货运有限公司,本公司承接成都至恩施,武汉至恩施及周边县市,覆射全国各地的整车、零担往返运输业务。我们以“信誉第一, spam ------------ 我用了9个月时间学会在医院的时候自己照顾自己这件事情 ham ------------ 在元宵节到来之际 spam ------------ 丰县大队要求各中队加强日常执勤 ham ------------ 欢迎致电开锁王。本公司公安备案,工商注册。经营范围:汽车锁,保险柜,防盗门及各种民用锁芯升级!联系电话xxxx-xxxxxxx spam ------------ 扬州天姿现在基本也留不住高端顾客群体了 ham ------------ </code></pre> </details> <details> <summary>spam_message_lr 样本示例</summary> <pre><code>------------ 3G小贴士提醒您可不要让您的流量白白浪费了哦,快来唤醒吧!与您分享杨子黎2013全新单曲《爱人好累》MV 详情点击:http://yuny.com.cn:3042/tpo/SU/NjiYby spam ------------ 巫倩云:来周总办公室现在 ham ------------ 结婚娶亲本是一件高兴事,新郎却因一辆加长林肯车而当场落泪!这是为什么?详情请点击 http://10006.co/lbJ5 spam ------------ PP提醒你来认证啦!在电脑上登录PP租车官方网站(www.ppzuche.com)或下载PP租车手机客户端(www.ppzuche.com/get-app)上传身份证和驾驶证照片,即可完成租客身份认证。600余款车型,低于市场价30%,随时随地取车,开启便捷用车新时代!【PP租车】 ham ------------ 【联通飞影】对美女自作多情的后果… http://fql.cc/pub/view/iid-48305 spam ------------ 您已成功添加王然(13811083077)为好友,可以接收对方的飞信消息。回复本短信可直接与对方聊天。 ham ------------ 棕盛商业广场一期5月18号火爆认筹,孟庭苇邀您亲见世界500强签约蕲春,VIP卡全城免费热办中。0713-7300000【棕盛商业地产】 spam ------------ 信用卡1707于10月23日16:19消费人民币360.07,记账金额以账单显示为准。加“浦发银行信用卡中心”官方微信查询/待续【浦发银行】 ham ------------ </code></pre> </details> ### 参考来源 <details> <summary>参考的数据来源,展开查看</summary> <pre><code> https://huggingface.co/datasets/dbarbedillo/SMS_Spam_Multilingual_Collection_Dataset https://huggingface.co/datasets/FredZhang7/all-scam-spam https://huggingface.co/datasets/Deysi/spam-detection-dataset https://huggingface.co/datasets/autoevaluate/autoeval-staging-eval-project-sms_spam-216c1ded-12215630 https://huggingface.co/datasets/NotShrirang/email-spam-filter https://huggingface.co/datasets/sms_spam </code></pre> </details>
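Every corpus listed in the card above boils down to (text, ham/spam) pairs, so a single binary classifier can be trained on any of them once the labels are normalized. The snippet below is only an illustrative baseline over a few made-up messages in the style of the samples shown; it is not an official loader for this repository, and the made-up texts, the character n-gram vectorizer and the Naive Bayes classifier are all assumptions for the sketch.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hand-written stand-ins for the (text, label) pairs shown in the sample blocks above.
texts = [
    "order viagra at incredibly low prices",
    "please read the attached document for the meeting",
    "URGENT! You have won a prize, call now to claim",
    "are we still meeting for lunch today?",
]
labels = ["spam", "ham", "spam", "ham"]

# Character n-grams keep the same pipeline usable for both English and Chinese messages.
model = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["win a free ringtone today"]))
```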
qgyd2021/spam_detect
[ "license:apache-2.0", "region:us" ]
2023-12-04T08:09:48+00:00
{"license": "apache-2.0"}
2023-12-05T06:45:21+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
垃圾信息检测 ------ 垃圾邮件, 垃圾短信息, 广告识别. ### 数据来源 数据集从网上收集整理如下: ### 样本示例 enron\_spam 样本示例 ``` ------------ wanted to try ci 4 lis but thought it was way too expensive for you ? viagra at $ 1 . 12 per dose ready to boost your sex life ? positive ? time to do it right now . order viagra at incredibly low prices $ 1 . 12 per dose . unbelivable remove spam ------------ enron / hpl actuals for december 11 , 2000 teco tap 30 . 000 / enron ; 120 . 000 / hpl gas daily ls hpl lsk ic 30 . 000 / enron ham ------------ looking for cheap high - quality software ? rotated napoleonizes water past also , burn , course . gave country , mass lot . act north good . from , learn form most brother vary . when more for . up stick , century put , song be . test , describe , plain , against wood star . began dress ever group . here oh , most world stay . spam ------------ ideabank website please read the attached document for information about an exciting new website for ets employees ! ham ------------ ``` enron\_spam\_subset 样本示例 ``` ------------ Subject: edrugs online viagra - proven step to start something all over again . nothing is more useful than silence . teachers open the door . you enter by yourself . how sharper than a serpent ' s tooth it isto have a thankless child ! spam ------------ Subject: start date : 12 / 13 / 01 ; hourahead hour : 5 ; start date : 12 / 13 / 01 ; hourahead hour : 5 ; no ancillary schedules awarded . no variances detected . log messages : parsing file - - > > o : \ portland \ westdesk \ california scheduling \ iso final schedules \ 2001121305 . txt ham ------------ Subject: cheapestt medss ! mediccationns at lowesst pricess everyy ! over 80 . % offf , pricess wontt get lowerr we selll vic ' od ( in v , ia . gra x , ana . x http : / / www . pr 3 sdlugs . com / ? refid = 87 spam ------------ Subject: fw : picture > > > the following is an aerial photo of the wtc area . it kinda brings on > vertigo , but is a phenomenal shot . > > http : / / userwww . service . emory . edu / ~ rdgarr / wtc . htm ham ------------ ``` ling\_spam 样本示例 ``` ------------ Subject: internet specialist 007 - the spy internet specialist 007 - the spy learn everything about your friends , neighbors , enemies , employees or anyone else ! - - even your boss ! - - even yourself ! this mammoth snoop collection of internet sites will provide you the newest and most current addresses available on the net today . . . = 20 * track down an old debt , or anyone else that has done you wrong ! it 's incredible , and so many new data sites have come online in the past 90 days . . . * over 300 giant resources to look up people , credit , social security , current or past employment , mail order purchases , = 20 addresses , phone numbers , maps to city locations . . . * investigate your family history ! check birth , death , adoption or social security records check service records or army , navy , air force or = 20 marine corps . * locate an old friend ( or an enemy who is hiding ) or a lost = 20 love - - find e-mail , telephone or address information on anyone ! = 20 even look up * unlisted * phone numbers ! * find work by searching classified ads all over the world ! * screen prospective employees - - check credit , driving or criminal records verify income or educational accomplishments = 20 * check out your daughter 's new boyfriend ! * find trial transcripts and court orders ! * enjoy the enchantment of finding out a juicy tid-bit about a co-worker . 
the internet is a powerful megasource of information , = 20 if you only know where to look . i tell you how to find = 20 out nearly anything about anybody , and tell you exactly where to find it ! you will be amazed to find out what personal information = 20 other people can find out about you ! check your credit = 20 report so you can correct wrong information that may be = 20 used to deny you credit . research yourself first ! you ' ll be horrified , as i was , = 20 at how much data has been accumulated about you . any my huge collection is only the beginning ! once you = 20 locate these free private , college and government web sites , you ' ll find even more links to even more = 20 information search engines ! = 20 if you believe ( like i do ) that the information that is stored about each one of us should be freely accessible , you ' ll want to see the snoop collection i ' ve compiled . verify your own records , or find out what you need to = 20 know about others . i ' m telling you , it 's incredible what you can find out using the internet ! we will accept checks by fax at 813-269 - 9651 or > > > send $ 14 . 95 cash , check or money order to : > > > the coldwell group > > > p . o . box 3787 > > > dept 1007 > > > petersburg , va 23805 i will rush back to you my snoop information for fastest service include your * e-mail * address . = 20 * what information is available - - and exact url to get there ! * exactly where to look for - - and the clever way to use - - = 20 the above search engines , and tons more ! * my easy-to - browse categorized megacenter of information has my own description of how to use each site , and what you ' ll find when you get there - - and tricky tips on how to = 20 extract the best data ! you can know everything about everybody with this internet specialist collection ! * * soon to be available - - the most complete international internet spy = 20 sites available on the web today * * don ' t miss this one or you ' ll be sorry = 20 to be removed from our list please fax your address to 813-269 - 9651 . l = e3 = 01 @ u = 0b spam ------------ Subject: usage - based models - symposium announcing the sixth biennial symposium of the rice university department of linguistics usage-based models of language rice university march 15-18 , 1995 invited speakers : mira ariel tel aviv university joan bybee university of new mexico john du bois university of california , santa barbara michael israel university of california , san diego sydney lamb rice university ronald langacker university of california , san diego tom givon university of oregon brian macwhinney carnegie - mellon university janet pierrehumbert northwestern university john sinclair university of birmingham ( u . k . ) arie verhagen university of utrecht description : the goal of this symposium is to explore approaches to linguistic theory that have in common the aim of accounting for linguistic usage . the empirical data for such theories is not restricted to linguistic intuitions about acceptibility , but comes from usage events of varied types . the focus is on the patterns found in the various sorts of usage data examined , and how those patterns can be extracted , represented , and used by the human mind . research from a variety of traditions will be represented , including corpus-based analyses , discourse studies , experimental studies of language processing and language acquisition , and instrumental phonetics . 
the approaches taken can be called data-driven , rather than model-driven , in that the fewest possible prior assumptions are made about what types of data are relevant , and that large sets of usage events are observed so that the detailed patterns found in actual usage can emerge . moreover , the various approaches taken show signs of converging toward a view of language as a dynamic system in which linguistic knowledge is not separate from its processing in language use . the linguistic models representing this view are usage-based by virtue of three factors : ( 1 ) the importance placed on usage data for theory construction ; ( 2 ) the direct incorporation of processing ( production and comprehension ) into linguistic theory ; and ( 3 ) the requirement that the models arrived at , whatever the direct source of evidence , must be testable with reference to language use . registration : no charge . symposium attendance on a space-available basis . for further information , contact suzanne kemmer ( kemmer @ ruf . rice . edu ) or michael barlow ( barlow @ ruf . rice . edu ) snailmail : dept . of linguistics , rice university , houston tx 77251-1892 . ham ------------ Subject: domani new improved with free software , free bulk e mail system , free web site = to do what you wish , ongoing support ( optional ) , and a lot more ! all = included . . . . . . . . . . . this is a one time mailing . . . . . . . . . . . . . . . \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ $ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ = \ \ \ \ \ you are about to make at least $ 50 , 000 in less than 90 days read the enclosed program . . . then read it again . . . / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / = / / / / / / / dear friend , the enclosed information is something i almost let slip through my fingers . fortunately , sometime later i re-read everything and gave some thought and study to it . my name is christopher erickson . two years ago , the corporation i worked at for the past twelve years down-sized and my position was eliminated . after unproductive job interviews , i decided to open my own business . over the past year , i incurred many unforeseen financial problems . i owed my family , friends , and creditors over $ 35 , 000 . the economy was taking a toll on my business and i just could n't seem to make ends meet . i had to refinance and borrow against my home to support my family and struggling business . i truly believe it was wrong for me to be in debt like this . at that moment something significant happened in my life and i am writing to share my experience in hopes that this will change your life forever . . . . financially ! ! ! in mid - december , i received this program via email . six months prior to receiving this program i had been sending away for information on various business opportunities . all of the programs i received , in my opinion , were not cost effective . they were either too difficult for me to comprehend or the initial investment was too much for me to risk to see if they worked or not . one claimed i 'd make a million dollars in one year . . . it did n't tell me i 'd have to write a book to make it . but like i was saying , in december of ' 92 i received this program . i did n't send for it , or ask for it , they just got my name off a mailing list . thank goodness for that ! ! ! 
after reading it several times , to = make sure i was reading it correctly , i could n't believe my eyes . = 20 here was a money-making phenomenon . i could invest as much as i wanted = to start , without putting me further in debt . after i got a pencil and paper and figured it out , i would at least get my money back . after determining that the program is legal and not a chain letter , i decided " why not " . initially i sent out 10 , 000 emails . it only cost me about $ 15 . 00 for my time on-line . the great thing about email is that i did n't need any money for printing to send out the program , only the cost to fulfill my orders . i am telling you like it is , i hope it does n't turn you off , but i promised myself that i would not " rip-off " anyone , no matter how much money it cost me ! . in less than one week , i was starting to receive orders for report # 1 . by january 13th , i had received 26 orders for report # 1 . when you read the guarantee in the program , you will see that " you must receive = 15 to 20 orders for report # 1 within two weeks . if you don ' t , send out = more programs until you do ! " my first step in making $ 50 , 000 in 20 to = 90 days was done . by january 30th , i had received 196 orders for report = # 2 . if you go back to the guarantee , " you must receive 100 or more orders for report # 2 within two weeks . if not , send out more = programs until you do . once you have 100 orders , the rest is easy , = relax , you will make your $ 50 , 000 goal . " well , i had 196 orders for = report # 2 , 96 more than i needed . so i sat back and relaxed . by march = 19th , of my emailing of 10 , 000 , i received $ 58 , 000 with more coming in = every day . i paid off all my debts and bought a much needed new car . please take time to read the attached program , it will change your life forever ! remember , it wont work if you do n't try it . this program does work , but you must follow it exactly ! especially the rules of not trying to place your name in a different place . it does n't work , you ' ll lose out on a lot of money ! report # 2 explains this . = 20 always follow the guarantee , 15 to 20 orders for report # 1 , and 100 or more orders for report # 2 and you will make $ 50 , 000 or more in 20 to 90 days . i am living proof that it works ! ! ! if you choose not to participate in this program , i ' m sorry . it really is a great opportunity with little cost or risk to you . if you choose to participate , follow the program and you will be on your way to financial security . if you are a fellow business owner and you are in financial trouble like i was , or you want to start your own business , consider this a sign . i did ! sincerely , christopher erickson ps do you have any idea what 11 , 700 $ 5 bills ( $ 58 , 000 ) look like piled up on a kitchen table ? it ' s awesome ! " threw it away " " i had received this program before . i threw it away , but later wondered if i should n't have given it a try . of course , i had no idea who to contact to get a copy , so i had to wait until i was emailed another copy of the program . eleven months passed , then it came . i didn ' t throw this one away . i made $ 41 , 000 on the first try . " dawn w . , evansville , in " no free lunch " " my late father always told me , ' remember , alan , there is no free lunch in life . you get out of life what you put into it . ' through trial and error and a somewhat slow frustrating start , i finally figured it out . 
the program works very well , i just had to find the right target group of people to email it to . so far this year , i have made over $ 63 , 000 using this program . i know my dad would have been very proud of me . " alan b . , philadelphia , pa a personal note from the originator of this program by the time you have read the enclosed information and looked over the enclosed program and reports , you should have concluded that such a program , and one that is legal , could not have been created by an amateur . let me tell you a little about myself . i had a profitable business for ten years . then in 1979 my business began falling off . i was doing the same things that were previously successful for me , but it was n't working . finally , i figured it out . it was n't me , it was the economy . inflation and recession had replaced the stable economy that had been with us since 1945 . i do n't have to tell you what happened to the unemployment rate . . . because many of you know from first hand experience . there were more failures and bankruptcies than ever before . the middle class was vanishing . those who knew what they were doing = invested wisely and moved up . those who did not , including those who = never had anything to save or invest , were moving down into the ranks of = the poor . as the saying goes , " the rich get richer and the poor get = poorer . " the traditional methods of making money will never allow you = to " move up " or " get rich " , inflation will see to that . you have just received information that can give you financial freedom for the rest of your life , with " no risk " and " just a little bit of effort . " you can make more money in the next few months than you have = ever imagined . i should also point out that i will not see a penny of your money , nor anyone else who has provided a testimonial for this program . i have already made over four million dollars ! i have retired from the program after sending out over 16 , 000 programs . now i have several offices which market this and several other programs here in the us and overseas . by the spring , we wish to market the ' internet ' by a partnership with america on line . follow the program exactly as instructed . do not change it in any way . = it works exceedingly well as it is now . remember to email a copy of = this exciting program to everyone that you can think of . one of the people you send this to may send out 50 , 000 . . . and your name will be on every one of them ! . remember though , the more you send out , the = more potential customers you will reach . so my friend , i have given you the ideas , information , materials and opportunity to become financially independent , it is up to you now ! " think about it " before you delete this program from your mailbox , as i almost did , take a little time to read it and really think about it . get a pencil and figure out what could happen when you participate . figure out the worst possible response and no matter how you calculate it , you will still make a lot of money ! definitely get back what you invested . = 20 any doubts you have will vanish when your first orders come in . it works ! paul johnson , raleigh , nc here ' s how this amazing program will make you $ $ $ $ $ $ let 's say that you decide to start small , just to see how it goes , and we ' ll assume you and all those involved send out 2 , 000 programs each . let 's also assume that the mailing receives a . 5 % response . using a good list the response could be much better . 
also many people will send out hundreds of thousands of programs instead of 2 , 000 . but continuing with this example , you send out only 2 , 000 programs . with a . 5 % response , that is only 10 orders for report # 1 . those 10 people respond by sending out 2 , 000 programs each for a total of 20 , 000 . out of those . 5 % , 100 people respond and order report # 2 . those 100 mail out 2 , 000 programs each for a total of 200 , 000 . the . 5 % response to that is 1 , 000 orders for report # 3 . those 1 , 000 send out 2 , 000 programs each for a 2 , 000 , 000 total . the . 5 % response to that is 10 , 000 orders for report # 4 . that 's 10 , 000 five dollar bills for you . cash ! ! ! ! your total income in this example is $ 50 + $ 500 + $ 5000 + $ 50 , 000 for a total of $ 55 , 550 ! ! ! ! remember friend , this is assuming 1 , 990 out of 2 , 000 people you mail to = will do absolutely nothing . . . and trash this program ! dare to think for = a moment what would happen if everyone or half sent out 100 , 000 programs instead of only 2 , 000 . believe me , many people will do = that and more ! by the way , your cost to participate in this is = practically nothing . you obviously already have an internet connection and email is free ! ! ! report # 3 will show you the best methods for bulk emailing and purchasing email lists . this is a legitimate , legal , money making opportunity . it does not require you to come in contact with people , do any hard work , and best of all , you never have to leave the house except to get the mail . if you believe that someday you ' ll get that big break that you ' ve been waiting for , this is it ! simply follow the instructions , and your dream will come true . this multi-level email order marketing program works perfectly . . . 100 % every time . email is the sales tool of the future . take advantage of this non-commercialized method of advertising now ! ! the longer you wait , the more people will be doing business using email . get your piece of this action ! ! multi-level marketing ( mlm ) has finally gained respectability . it is = being taught in the harvard business school , and both stanford research and the wall street journal have stated that between 50 % and = 65 % of all goods and services will be sold throughout multi - level methods by the mid to late 1990 's . this is a multi - billion dollar industry and of the 500 , 000 millionaires in the us , 20 % ( 100 , 000 ) made their fortune in the last several years in mlm . moreover , statistics show 45 people become millionaires everyday through multi - level marketing . instructions we at erris mail order marketing business , have a method of raising capital that really works 100 % every time . i am sure that you could use = $ 50 , 000 to $ 125 , 000 in the next 20 to 90 days . before you say " bull " , please read the program carefully . this is not a chain letter , but a perfectly legal money making opportunity . basically , this is what we do : as with all multi-level business , we build our business by recruiting new partners and selling our products . every state in the usa allows you to recruit new multi - level business partners , and we offer a product for every dollar sent . your orders come and are filled through the mail , so you are not = involved in personal selling . you do it privately in your own home , = store or office . this is the greatest multi - level mail order marketing anywhere : step ( 1 ) order all four 4 reports listed by name and number . 
dothis by ordering the report from each of the four 4 names listed on the next page . for each report , send $ 5 cash and a self - addressed , stamped envelope ( business size # 10 ) = to the person listed for the specific report . international = = 20 orders should also include $ 2 extra for postage . it is essential that you specify the name and number of the report requested to the person you are ordering from . you will need all four 4 reports because you will be reprinting and reselling them . do not alter the names or sequence other than what the instructions say . important : always provide same-day service on all orders . step ( 2 ) replace the name and address under report # 1 with yours , moving the one that was there down to report # 2 . drop the name and address under report # 2 to report # 3 , moving the one that was there to report # 4 . the name and address that was under report # 4 is dropped from the list and this party is no doubt on the way to the bank . when doing this , make certain you type the names and addresses accurately ! do not mix up moving product / report positions ! ! ! step ( 3 ) having made the required changes in the name list , save it as a text ( . txt ) file in it 's own directory to be used with whatever email program you like . again , report # 3 will tell you the best methods of bulk emailing and acquiring email lists . step ( 4 ) email a copy of the entire program ( all of this is very important ) to everyone whose address you can get your hands on . start with friends and relatives since you can encourage them to take advantage of this fabulous = 20 money-making opportunity . that 's what i did . and they love me now , more than ever . then , email to anyone and everyone ! use your imagination ! you can get email addresses from companies on the internet who specialize in email mailing lists . these are very cheap , 100 , 000 addresses for around $ 35 . 00 . important : you won't get a good response if you use an old list , so always request a fresh , new list . you will find out where to purchase these lists when you order the four 4 reports . always provide same-day service on all orders ! ! ! required reports * * * order each report by number and name * * * always send a self-addressed , stamped envelope and $ 5 usd cash for each order requesting the specific report by name and number ( international orders should also include $ 2 usd extra for postage ) = 20 add you e amil address when sending in for your report this is for = updated information and continueing support ( optional ) that will be = handed down by you sponcers . _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 1 " how to make $ 250 , 000 through multi-level sales " order report # 1 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 2 " major corporations and multi-level sales " j . maz 15774 s . lagrange rd suite # 312 orland pk , il 60462 usa _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ order report # 2 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 3 " sources for the best mailing lists " order report # 3 from : b . thompson 13504 greencaslte ridge tr . 404 burtonsville md . 
20866 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 4 " evaluating multi-level sales plans " order report # 4 from : muw # 2 po box 71442 salt lake city , ut 84171-0442 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ conclusion . i am enjoying my fortune that i made by sending out this program . you too , will be making money in 20 to 90 days , if you follow the simple steps outlined in this mailing . to be financially independent is to be free . free to make financial decisions as never before . go into business , get into investments , retire or take a vacation . = 20 = = = = = = 20 spam ------------ Subject: linguistic datasources at the request of subscribers , we ' ve been collecting the addresses of linguistic datasources which can be reached through world wide web . these addresses are now available to any of you who have web access on the linguist web server at the following url : http : / / engserve . tamu . edu / files / linguistics / linguist / datasources . html this file is also available , to those of you who read web - linguist , through the " linguistic datasources " link . we 'd be delighted to hear any comments anyone would care to make . and if there ' re any addresses we need to add , please let us know what they are . we 'd like to emphasize that we 'd be happy to include sites where individual linguists keep data they would like to make available to their colleagues . since the web allows us to share not merely text , but pictures and sound-recordings , we can now begin an interchange of linguistic information that is of a very different nature from that which was possible in the past . anthony & helen ham ------------ ``` sms\_spam 样本示例 ``` ------------ She's borderline but yeah whatever. ham ------------ URGENT! You have won a 1 week FREE membership in our £100,000 Prize Jackpot! Txt the word: CLAIM to No: 81010 T&C URL LCCLTD POBOX 4403LDNW1A7RW18 spam ------------ I got a call from a landline number. . . I am asked to come to anna nagar . . . I will go in the afternoon ham ------------ Free Msg: get Gnarls Barkleys "Crazy" ringtone TOTALLY FREE just reply GO to this message right now! spam ------------ I will be gentle princess! We will make sweet gentle love... ham ------------ Refused a loan? Secured or Unsecured? Can't get credit? Call free now 0800 195 6669 or text back 'help' & we will! spam ------------ Sorry, went to bed early, nightnight ham ------------ Free entry in 2 a wkly comp to win FA Cup final tkts 21st May 2005. Text FA to 87121 to receive entry question(std txt rate)T&C's apply 08452810075over18's spam ------------ HMM yeah if your not too grooved out! And im looking forward to my pound special :) ham ------------ FreeMsg Hey there darling it's been 3 week's now and no word back! I'd like some fun you up for it still? Tb ok! XxX std chgs to send, £1.50 to rcv spam ------------ Hi dude hw r u da realy mising u today ham ------------ WINNER!! As a valued network customer you have been selected to receivea £900 prize reward! To claim call 09061701461. Claim code KL341. Valid 12 hours only. 
spam ------------ ``` spam\_assassin 样本示例 ``` ------------ "\n>>>Robert Elz said:\n > Date: Wed, 28 Aug 2002 09:22:34 -0500\n > From: Chris Garrigues \n > Message-ID: <1030544555.28815.TMDA@URL>\n > \n > \n > | so I'll probably poke around at the sequences performance issues,\n > \n > Well, there's this wonderful piece of code in MhSeqExpand ...\n > \n > # Hack to weed out sequence numbers for messages that don't exist\n > foreach m $rseq {\n > if ![file exists $mhProfile(path)/$folder/$m] {\n > Exmh\_Debug $mhProfile(path)/$folder/$m not found\n > set ix [lsearch $seq $m]\n > set seq [lreplace $seq $ix $ix]\n > } else {\n > # Real hack\n\nAt least I'm up-front about my hacks :-)\n\n > break\n > }\n > }\n > \n > which is going to run slow if a sequence happens to start with a bunch\n > of messages that don't exist. I'm not sure why it is important that the\n > first message in the sequence returned exists, but not necessarily any\n > of the others, but I'm sure glad it is, as MhSeqExpand gets called lots,\n > and I don't know if I could cope if it were checking every file in the\n > sequences it is looking at, all the time...\n\nThat was my thinking. My recollection about the first message being valid\nis that the ftoc code wants to find that message to start its highlighting,\nfor example, or you are selecting a message to display.\n\n > It may help to keep a list of the valid message numbers for the current\n > folder (though that would then need to be verified against changes to the\n > directory). Does tcl have a directory read function? I assume so...\n\nglob -nocomplain $mhProfile(path)/$folder \*\nwill return an unsorted list of the directory's contents.\nBut the thought of keeping an in memory list of valid messages is not fun.\nExmh already maintains in-core lists of messages in sequences, which is\nalready pretty tricky\n\n > Mh\_Sequence also goes and rereads the files (.mh\_sequences and the\n > context file) but I'm not sure how frequently that one is called.\n\nIn some places I maintain caches of files by checking their modify time,\nbut the sequence files are soo small that by the time you stat them to\ncheck their date stamp, you could just read them again. Also, now that\nwe checkpoint message state on every message view, that file will change\nevery time. In the old days exmh used to cache a bunch of state about\nthe folder.\n\n--\nBrent Welch\nSoftware Architect, Panasas Inc\nPioneering the World's Most Scalable and Agile Storage Network\URL\nwelch@URL\n\n\n\n\n\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\nExmh-workers mailing list\nExmh-workers@URL\nhttps://URL spam ------------ "Whiter teeth and a brighter smile are just a click away! \nhttp://URL you considered professional teeth whitening? If so, you know it usually costs between $300 and $500 from your local dentist!\n\nVisit our site to learn how to professionally whiten your teeth, using the exact same whitening system your dentist uses, at a fraction of the cost!\n\nWe know our product is the best on the market, and we back it with a 30 day money back guarantee!\n\nClick here to find out more!\nhttp://URL \n \n \n \n \n---------------------------------------\nTo easily remove your address from the list, go to: \nhttp://URL allow 48-72 hours for removal.\n\n" ham ------------ "> Hi,\n> \n> On Sun, 01 Sep 2002 00:05:03 MDT Reg Clemens wrote: \n> \n> [...]\n> > in messages with GnuPG signatures. 
But punching the line ALWAYS\n> > gives\n> > \n> > Signature made Thu Aug 29 00:27:17 2002 MDT using DSA key ID BDDF997A\n> > Can't check signature: public key not found\n> > \n> > So, something else is missing.\n> \n> Yes, the public key of the signature you want to check :-).\n> \n> Are you really sure that you have the public key of the message's\n> signature? If not, try downloading it or try to check a signature from\n> which you know you have the public key.\n> \n> \n> \n\nAh, sorry for not making that clearer.\nBut no.\nPreviously (v1.0.6 of GnuPG) there would be a slight pause at this point while\nit went out to get the public key from a keyserver.\nNow, whether I have the key or NOT, I get the failure message.\n\nIts as if it cant find gpg to execute it (but I fixed that path), so there\nmust be something else that I am missing...\n\n\n-- \n Reg.Clemens\n reg@URL\n\n\n\n\n\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\_\nExmh-users mailing list\nExmh-users@URL\nhttps://URL spam ------------ "Below is the result of your feedback form. It was submitted by\n (jamie@URL) on Tuesday, August 27, 2002 at 22:26:56\n---------------------------------------------------------------------------\n\n:: [click here](\"URL\")Don't want to pay for Porn? Would you like to get it for FREE? The Honest, No Risk 100% Free Way? If you take a couple of minutes and read through our simple guide you will be able to get Free Passes to the top Paysites online![click here](\"URL\")\n\n---------------------------------------------------------------------------\n\n\n" ham ------------ ``` spam\_detection 样本示例 ``` ------------ Looking for an epic way to boost your followers? Want to increase your likes? Well, listen up! Our site has the ultimate solution for you! With our automated system, you can get hundreds of followers in seconds! Not only that, but our team of social media experts will also provide you with the latest and greatest tips and tricks to make your page go viral! Don't wait, join the millions of successful users who have already transformed their social media game with our help! spam ------------ I'm working on a bayesian network to genetare synthetic WIM observations such as gross vehicle weight, total vehicle length, individual axle load and inter axle distances. It will be very helpful to have data sets for different sites. Thanks ham ------------ Hey, check out these amazing deals just for you! Get rich quick! Click here to learn how! Need a date? We've got you covered! Join our dating site now! Burn fat fast with our miracle pill! Guaranteed results! Want to see what everyone's talking about? Click here for the latest viral videos! Don't miss out on our exclusive discount codes! Shop now and save big! Feeling lucky? Enter our sweepstakes for spam ------------ Looking for a dataset with the latest Omicron covid variant data and confirmed locations. Any links would be appreciated, thanks! 
ham

------------

spam\_message 样本示例

```
------------
欣彩印刷厂主营彩盒.吊牌.产品说明书及提供四色印刷对外加工.电话xxxxxxxxxxx米先生
spam
------------
微软称Windows10是“有史以来最安全的Windows版本”
ham
------------
您好!欢迎致电川中意货运有限公司,本公司承接成都至恩施,武汉至恩施及周边县市,覆射全国各地的整车、零担往返运输业务。我们以“信誉第一,
spam
------------
我用了9个月时间学会在医院的时候自己照顾自己这件事情
ham
------------
在元宵节到来之际
spam
------------
丰县大队要求各中队加强日常执勤
ham
------------
欢迎致电开锁王。本公司公安备案,工商注册。经营范围:汽车锁,保险柜,防盗门及各种民用锁芯升级!联系电话xxxx-xxxxxxx
spam
------------
扬州天姿现在基本也留不住高端顾客群体了
ham
------------
```

spam\_message\_lr 样本示例

```
------------
3G小贴士提醒您可不要让您的流量白白浪费了哦,快来唤醒吧!与您分享杨子黎2013全新单曲《爱人好累》MV 详情点击:URL
spam
------------
巫倩云:来周总办公室现在
ham
------------
结婚娶亲本是一件高兴事,新郎却因一辆加长林肯车而当场落泪!这是为什么?详情请点击 URL
spam
------------
PP提醒你来认证啦!在电脑上登录PP租车官方网站(URL)或下载PP租车手机客户端(URL)上传身份证和驾驶证照片,即可完成租客身份认证。600余款车型,低于市场价30%,随时随地取车,开启便捷用车新时代!【PP租车】
ham
------------
【联通飞影】对美女自作多情的后果… URL
spam
------------
您已成功添加王然(13811083077)为好友,可以接收对方的飞信消息。回复本短信可直接与对方聊天。
ham
------------
棕盛商业广场一期5月18号火爆认筹,孟庭苇邀您亲见世界500强签约蕲春,VIP卡全城免费热办中。0713-7300000【棕盛商业地产】
spam
------------
信用卡1707于10月23日16:19消费人民币360.07,记账金额以账单显示为准。加“浦发银行信用卡中心”官方微信查询/待续【浦发银行】
ham
------------
```

### 参考来源

参考的数据来源,展开查看

```
URL
URL
URL
URL
URL
URL
```
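Every subset listed above reduces to the same shape — a piece of text plus a ham/spam label — so a single loading routine is enough to inspect any of them. The sketch below is a minimal illustration, not a verified recipe: the repository id is a placeholder, and the subset name, split name, and the `text`/`label` column names are assumptions inferred from the sample listings rather than identifiers confirmed by this card.

```python
from collections import Counter

from datasets import load_dataset

# Placeholder repository id -- substitute the actual id of this dataset repo.
REPO_ID = "<namespace>/<spam-dataset>"

# The subset name is assumed to match the section titles above
# (e.g. "enron_spam", "ling_spam", "sms_spam", "spam_message").
dataset = load_dataset(REPO_ID, name="sms_spam", split="train")

# The listings above are text/label pairs; the column names "text" and
# "label" are assumptions made for this illustration.
label_counts = Counter(example["label"] for example in dataset)
print(label_counts)

# Print a few rows in the same "text -> label" form as the listings above.
for example in dataset.select(range(3)):
    print(example["text"][:80], "->", example["label"])
```

If `label` turns out to be a `ClassLabel` column rather than a plain string, the integer values can be mapped back to `ham`/`spam` through `dataset.features["label"].names` before counting.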
[ "TAGS\n#license-apache-2.0 #region-us \n", "### 数据来源\n\n\n数据集从网上收集整理如下:", "### 样本示例\n\n\n\nenron\\_spam 样本示例\n\n```\n------------\nwanted to try ci 4 lis but thought it was way too expensive for you ?\n \n\nviagra at $ 1 . 12 per dose\nready to boost your sex life ? positive ?\ntime to do it right now . order viagra at incredibly low prices\n$ 1 . 12 per dose . unbelivable\nremove\n \n\nspam\n------------\nenron / hpl actuals for december 11 , 2000\n \n\nteco tap 30 . 000 / enron ; 120 . 000 / hpl gas daily\nls hpl lsk ic 30 . 000 / enron\nham\n------------\nlooking for cheap high - quality software ? rotated napoleonizes\n \n\nwater past also , burn , course . gave country , mass lot . act north\ngood . from , learn form most brother vary . when more for . up\nstick , century put , song be . test , describe , plain , against wood\nstar . began dress ever group . here oh , most world stay .\n \n\nspam\n------------\nideabank website\n \n\nplease read the attached document for information about an exciting new\nwebsite for ets employees !\nham\n------------\n\n```\n\n\n\nenron\\_spam\\_subset 样本示例\n\n```\n------------\nSubject: edrugs online\n viagra - proven step to start something all over again .\n nothing is more useful than silence .\n teachers open the door . you enter by yourself .\n how sharper than a serpent ' s tooth it isto have a thankless child !\nspam\n------------\nSubject: start date : 12 / 13 / 01 ; hourahead hour : 5 ;\n start date : 12 / 13 / 01 ; hourahead hour : 5 ; no ancillary schedules awarded . no variances detected .\n log messages :\n parsing file - - > > o : \\ portland \\ westdesk \\ california scheduling \\ iso final schedules \\ 2001121305 . txt\nham\n------------\nSubject: cheapestt medss !\n mediccationns at lowesst pricess everyy !\n over 80 . % offf , pricess wontt get lowerr\n we selll vic ' od ( in v , ia . gra x , ana . x\n http : / / www . pr 3 sdlugs . com / ? refid = 87\n \n\nspam\n------------\nSubject: fw : picture\n >\n >\n > the following is an aerial photo of the wtc area . it kinda brings on\n > vertigo , but is a phenomenal shot .\n >\n > http : / / userwww . service . emory . edu / ~ rdgarr / wtc . htm\nham\n------------\n\n```\n\n\n\nling\\_spam 样本示例\n\n```\n------------\nSubject: internet specialist 007 - the spy\n \n\ninternet specialist 007 - the spy learn everything about your friends , neighbors , enemies , employees or anyone else ! - - even your boss ! - - even yourself ! this mammoth snoop collection of internet sites will provide you the newest and most current addresses available on the net today . . . = 20 * track down an old debt , or anyone else that has done you wrong ! it 's incredible , and so many new data sites have come online in the past 90 days . . . * over 300 giant resources to look up people , credit , social security , current or past employment , mail order purchases , = 20 addresses , phone numbers , maps to city locations . . . * investigate your family history ! check birth , death , adoption or social security records check service records or army , navy , air force or = 20 marine corps . * locate an old friend ( or an enemy who is hiding ) or a lost = 20 love - - find e-mail , telephone or address information on anyone ! = 20 even look up * unlisted * phone numbers ! * find work by searching classified ads all over the world ! * screen prospective employees - - check credit , driving or criminal records verify income or educational accomplishments = 20 * check out your daughter 's new boyfriend ! 
* find trial transcripts and court orders ! * enjoy the enchantment of finding out a juicy tid-bit about a co-worker . the internet is a powerful megasource of information , = 20 if you only know where to look . i tell you how to find = 20 out nearly anything about anybody , and tell you exactly where to find it ! you will be amazed to find out what personal information = 20 other people can find out about you ! check your credit = 20 report so you can correct wrong information that may be = 20 used to deny you credit . research yourself first ! you ' ll be horrified , as i was , = 20 at how much data has been accumulated about you . any my huge collection is only the beginning ! once you = 20 locate these free private , college and government web sites , you ' ll find even more links to even more = 20 information search engines ! = 20 if you believe ( like i do ) that the information that is stored about each one of us should be freely accessible , you ' ll want to see the snoop collection i ' ve compiled . verify your own records , or find out what you need to = 20 know about others . i ' m telling you , it 's incredible what you can find out using the internet ! we will accept checks by fax at 813-269 - 9651 or > > > send $ 14 . 95 cash , check or money order to : > > > the coldwell group > > > p . o . box 3787 > > > dept 1007 > > > petersburg , va 23805 i will rush back to you my snoop information for fastest service include your * e-mail * address . = 20 * what information is available - - and exact url to get there ! * exactly where to look for - - and the clever way to use - - = 20 the above search engines , and tons more ! * my easy-to - browse categorized megacenter of information has my own description of how to use each site , and what you ' ll find when you get there - - and tricky tips on how to = 20 extract the best data ! you can know everything about everybody with this internet specialist collection ! * * soon to be available - - the most complete international internet spy = 20 sites available on the web today * * don ' t miss this one or you ' ll be sorry = 20 to be removed from our list please fax your address to 813-269 - 9651 . l = e3 = 01 @ u = 0b\n \n\nspam\n------------\nSubject: usage - based models - symposium\n \n\nannouncing the sixth biennial symposium of the rice university department of linguistics usage-based models of language rice university march 15-18 , 1995 invited speakers : mira ariel tel aviv university joan bybee university of new mexico john du bois university of california , santa barbara michael israel university of california , san diego sydney lamb rice university ronald langacker university of california , san diego tom givon university of oregon brian macwhinney carnegie - mellon university janet pierrehumbert northwestern university john sinclair university of birmingham ( u . k . ) arie verhagen university of utrecht description : the goal of this symposium is to explore approaches to linguistic theory that have in common the aim of accounting for linguistic usage . the empirical data for such theories is not restricted to linguistic intuitions about acceptibility , but comes from usage events of varied types . the focus is on the patterns found in the various sorts of usage data examined , and how those patterns can be extracted , represented , and used by the human mind . 
research from a variety of traditions will be represented , including corpus-based analyses , discourse studies , experimental studies of language processing and language acquisition , and instrumental phonetics . the approaches taken can be called data-driven , rather than model-driven , in that the fewest possible prior assumptions are made about what types of data are relevant , and that large sets of usage events are observed so that the detailed patterns found in actual usage can emerge . moreover , the various approaches taken show signs of converging toward a view of language as a dynamic system in which linguistic knowledge is not separate from its processing in language use . the linguistic models representing this view are usage-based by virtue of three factors : ( 1 ) the importance placed on usage data for theory construction ; ( 2 ) the direct incorporation of processing ( production and comprehension ) into linguistic theory ; and ( 3 ) the requirement that the models arrived at , whatever the direct source of evidence , must be testable with reference to language use . registration : no charge . symposium attendance on a space-available basis . for further information , contact suzanne kemmer ( kemmer @ ruf . rice . edu ) or michael barlow ( barlow @ ruf . rice . edu ) snailmail : dept . of linguistics , rice university , houston tx 77251-1892 .\n \n\nham\n------------\nSubject: domani\n \n\nnew improved with free software , free bulk e mail system , free web site = to do what you wish , ongoing support ( optional ) , and a lot more ! all = included . . . . . . . . . . . this is a one time mailing . . . . . . . . . . . . . . . \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ $ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ \\ = \\ \\ \\ \\ \\ you are about to make at least $ 50 , 000 in less than 90 days read the enclosed program . . . then read it again . . . / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / = / / / / / / / dear friend , the enclosed information is something i almost let slip through my fingers . fortunately , sometime later i re-read everything and gave some thought and study to it . my name is christopher erickson . two years ago , the corporation i worked at for the past twelve years down-sized and my position was eliminated . after unproductive job interviews , i decided to open my own business . over the past year , i incurred many unforeseen financial problems . i owed my family , friends , and creditors over $ 35 , 000 . the economy was taking a toll on my business and i just could n't seem to make ends meet . i had to refinance and borrow against my home to support my family and struggling business . i truly believe it was wrong for me to be in debt like this . at that moment something significant happened in my life and i am writing to share my experience in hopes that this will change your life forever . . . . financially ! ! ! in mid - december , i received this program via email . six months prior to receiving this program i had been sending away for information on various business opportunities . all of the programs i received , in my opinion , were not cost effective . they were either too difficult for me to comprehend or the initial investment was too much for me to risk to see if they worked or not . 
one claimed i 'd make a million dollars in one year . . . it did n't tell me i 'd have to write a book to make it . but like i was saying , in december of ' 92 i received this program . i did n't send for it , or ask for it , they just got my name off a mailing list . thank goodness for that ! ! ! after reading it several times , to = make sure i was reading it correctly , i could n't believe my eyes . = 20 here was a money-making phenomenon . i could invest as much as i wanted = to start , without putting me further in debt . after i got a pencil and paper and figured it out , i would at least get my money back . after determining that the program is legal and not a chain letter , i decided \" why not \" . initially i sent out 10 , 000 emails . it only cost me about $ 15 . 00 for my time on-line . the great thing about email is that i did n't need any money for printing to send out the program , only the cost to fulfill my orders . i am telling you like it is , i hope it does n't turn you off , but i promised myself that i would not \" rip-off \" anyone , no matter how much money it cost me ! . in less than one week , i was starting to receive orders for report # 1 . by january 13th , i had received 26 orders for report # 1 . when you read the guarantee in the program , you will see that \" you must receive = 15 to 20 orders for report # 1 within two weeks . if you don ' t , send out = more programs until you do ! \" my first step in making $ 50 , 000 in 20 to = 90 days was done . by january 30th , i had received 196 orders for report = # 2 . if you go back to the guarantee , \" you must receive 100 or more orders for report # 2 within two weeks . if not , send out more = programs until you do . once you have 100 orders , the rest is easy , = relax , you will make your $ 50 , 000 goal . \" well , i had 196 orders for = report # 2 , 96 more than i needed . so i sat back and relaxed . by march = 19th , of my emailing of 10 , 000 , i received $ 58 , 000 with more coming in = every day . i paid off all my debts and bought a much needed new car . please take time to read the attached program , it will change your life forever ! remember , it wont work if you do n't try it . this program does work , but you must follow it exactly ! especially the rules of not trying to place your name in a different place . it does n't work , you ' ll lose out on a lot of money ! report # 2 explains this . = 20 always follow the guarantee , 15 to 20 orders for report # 1 , and 100 or more orders for report # 2 and you will make $ 50 , 000 or more in 20 to 90 days . i am living proof that it works ! ! ! if you choose not to participate in this program , i ' m sorry . it really is a great opportunity with little cost or risk to you . if you choose to participate , follow the program and you will be on your way to financial security . if you are a fellow business owner and you are in financial trouble like i was , or you want to start your own business , consider this a sign . i did ! sincerely , christopher erickson ps do you have any idea what 11 , 700 $ 5 bills ( $ 58 , 000 ) look like piled up on a kitchen table ? it ' s awesome ! \" threw it away \" \" i had received this program before . i threw it away , but later wondered if i should n't have given it a try . of course , i had no idea who to contact to get a copy , so i had to wait until i was emailed another copy of the program . eleven months passed , then it came . i didn ' t throw this one away . i made $ 41 , 000 on the first try . \" dawn w . 
, evansville , in \" no free lunch \" \" my late father always told me , ' remember , alan , there is no free lunch in life . you get out of life what you put into it . ' through trial and error and a somewhat slow frustrating start , i finally figured it out . the program works very well , i just had to find the right target group of people to email it to . so far this year , i have made over $ 63 , 000 using this program . i know my dad would have been very proud of me . \" alan b . , philadelphia , pa a personal note from the originator of this program by the time you have read the enclosed information and looked over the enclosed program and reports , you should have concluded that such a program , and one that is legal , could not have been created by an amateur . let me tell you a little about myself . i had a profitable business for ten years . then in 1979 my business began falling off . i was doing the same things that were previously successful for me , but it was n't working . finally , i figured it out . it was n't me , it was the economy . inflation and recession had replaced the stable economy that had been with us since 1945 . i do n't have to tell you what happened to the unemployment rate . . . because many of you know from first hand experience . there were more failures and bankruptcies than ever before . the middle class was vanishing . those who knew what they were doing = invested wisely and moved up . those who did not , including those who = never had anything to save or invest , were moving down into the ranks of = the poor . as the saying goes , \" the rich get richer and the poor get = poorer . \" the traditional methods of making money will never allow you = to \" move up \" or \" get rich \" , inflation will see to that . you have just received information that can give you financial freedom for the rest of your life , with \" no risk \" and \" just a little bit of effort . \" you can make more money in the next few months than you have = ever imagined . i should also point out that i will not see a penny of your money , nor anyone else who has provided a testimonial for this program . i have already made over four million dollars ! i have retired from the program after sending out over 16 , 000 programs . now i have several offices which market this and several other programs here in the us and overseas . by the spring , we wish to market the ' internet ' by a partnership with america on line . follow the program exactly as instructed . do not change it in any way . = it works exceedingly well as it is now . remember to email a copy of = this exciting program to everyone that you can think of . one of the people you send this to may send out 50 , 000 . . . and your name will be on every one of them ! . remember though , the more you send out , the = more potential customers you will reach . so my friend , i have given you the ideas , information , materials and opportunity to become financially independent , it is up to you now ! \" think about it \" before you delete this program from your mailbox , as i almost did , take a little time to read it and really think about it . get a pencil and figure out what could happen when you participate . figure out the worst possible response and no matter how you calculate it , you will still make a lot of money ! definitely get back what you invested . = 20 any doubts you have will vanish when your first orders come in . it works ! 
paul johnson , raleigh , nc here ' s how this amazing program will make you $ $ $ $ $ $ let 's say that you decide to start small , just to see how it goes , and we ' ll assume you and all those involved send out 2 , 000 programs each . let 's also assume that the mailing receives a . 5 % response . using a good list the response could be much better . also many people will send out hundreds of thousands of programs instead of 2 , 000 . but continuing with this example , you send out only 2 , 000 programs . with a . 5 % response , that is only 10 orders for report # 1 . those 10 people respond by sending out 2 , 000 programs each for a total of 20 , 000 . out of those . 5 % , 100 people respond and order report # 2 . those 100 mail out 2 , 000 programs each for a total of 200 , 000 . the . 5 % response to that is 1 , 000 orders for report # 3 . those 1 , 000 send out 2 , 000 programs each for a 2 , 000 , 000 total . the . 5 % response to that is 10 , 000 orders for report # 4 . that 's 10 , 000 five dollar bills for you . cash ! ! ! ! your total income in this example is $ 50 + $ 500 + $ 5000 + $ 50 , 000 for a total of $ 55 , 550 ! ! ! ! remember friend , this is assuming 1 , 990 out of 2 , 000 people you mail to = will do absolutely nothing . . . and trash this program ! dare to think for = a moment what would happen if everyone or half sent out 100 , 000 programs instead of only 2 , 000 . believe me , many people will do = that and more ! by the way , your cost to participate in this is = practically nothing . you obviously already have an internet connection and email is free ! ! ! report # 3 will show you the best methods for bulk emailing and purchasing email lists . this is a legitimate , legal , money making opportunity . it does not require you to come in contact with people , do any hard work , and best of all , you never have to leave the house except to get the mail . if you believe that someday you ' ll get that big break that you ' ve been waiting for , this is it ! simply follow the instructions , and your dream will come true . this multi-level email order marketing program works perfectly . . . 100 % every time . email is the sales tool of the future . take advantage of this non-commercialized method of advertising now ! ! the longer you wait , the more people will be doing business using email . get your piece of this action ! ! multi-level marketing ( mlm ) has finally gained respectability . it is = being taught in the harvard business school , and both stanford research and the wall street journal have stated that between 50 % and = 65 % of all goods and services will be sold throughout multi - level methods by the mid to late 1990 's . this is a multi - billion dollar industry and of the 500 , 000 millionaires in the us , 20 % ( 100 , 000 ) made their fortune in the last several years in mlm . moreover , statistics show 45 people become millionaires everyday through multi - level marketing . instructions we at erris mail order marketing business , have a method of raising capital that really works 100 % every time . i am sure that you could use = $ 50 , 000 to $ 125 , 000 in the next 20 to 90 days . before you say \" bull \" , please read the program carefully . this is not a chain letter , but a perfectly legal money making opportunity . basically , this is what we do : as with all multi-level business , we build our business by recruiting new partners and selling our products . 
every state in the usa allows you to recruit new multi - level business partners , and we offer a product for every dollar sent . your orders come and are filled through the mail , so you are not = involved in personal selling . you do it privately in your own home , = store or office . this is the greatest multi - level mail order marketing anywhere : step ( 1 ) order all four 4 reports listed by name and number . dothis by ordering the report from each of the four 4 names listed on the next page . for each report , send $ 5 cash and a self - addressed , stamped envelope ( business size # 10 ) = to the person listed for the specific report . international = = 20 orders should also include $ 2 extra for postage . it is essential that you specify the name and number of the report requested to the person you are ordering from . you will need all four 4 reports because you will be reprinting and reselling them . do not alter the names or sequence other than what the instructions say . important : always provide same-day service on all orders . step ( 2 ) replace the name and address under report # 1 with yours , moving the one that was there down to report # 2 . drop the name and address under report # 2 to report # 3 , moving the one that was there to report # 4 . the name and address that was under report # 4 is dropped from the list and this party is no doubt on the way to the bank . when doing this , make certain you type the names and addresses accurately ! do not mix up moving product / report positions ! ! ! step ( 3 ) having made the required changes in the name list , save it as a text ( . txt ) file in it 's own directory to be used with whatever email program you like . again , report # 3 will tell you the best methods of bulk emailing and acquiring email lists . step ( 4 ) email a copy of the entire program ( all of this is very important ) to everyone whose address you can get your hands on . start with friends and relatives since you can encourage them to take advantage of this fabulous = 20 money-making opportunity . that 's what i did . and they love me now , more than ever . then , email to anyone and everyone ! use your imagination ! you can get email addresses from companies on the internet who specialize in email mailing lists . these are very cheap , 100 , 000 addresses for around $ 35 . 00 . important : you won't get a good response if you use an old list , so always request a fresh , new list . you will find out where to purchase these lists when you order the four 4 reports . always provide same-day service on all orders ! ! ! required reports * * * order each report by number and name * * * always send a self-addressed , stamped envelope and $ 5 usd cash for each order requesting the specific report by name and number ( international orders should also include $ 2 usd extra for postage ) = 20 add you e amil address when sending in for your report this is for = updated information and continueing support ( optional ) that will be = handed down by you sponcers . _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 1 \" how to make $ 250 , 000 through multi-level sales \" order report # 1 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 2 \" major corporations and multi-level sales \" j . maz 15774 s . 
lagrange rd suite # 312 orland pk , il 60462 usa _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ order report # 2 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 3 \" sources for the best mailing lists \" order report # 3 from : b . thompson 13504 greencaslte ridge tr . 404 burtonsville md . 20866 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 4 \" evaluating multi-level sales plans \" order report # 4 from : muw # 2 po box 71442 salt lake city , ut 84171-0442 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ conclusion . i am enjoying my fortune that i made by sending out this program . you too , will be making money in 20 to 90 days , if you follow the simple steps outlined in this mailing . to be financially independent is to be free . free to make financial decisions as never before . go into business , get into investments , retire or take a vacation . = 20 = = = = = = 20\n \n\nspam\n------------\nSubject: linguistic datasources\n \n\nat the request of subscribers , we ' ve been collecting the addresses of linguistic datasources which can be reached through world wide web . these addresses are now available to any of you who have web access on the linguist web server at the following url : http : / / engserve . tamu . edu / files / linguistics / linguist / datasources . html this file is also available , to those of you who read web - linguist , through the \" linguistic datasources \" link . we 'd be delighted to hear any comments anyone would care to make . and if there ' re any addresses we need to add , please let us know what they are . we 'd like to emphasize that we 'd be happy to include sites where individual linguists keep data they would like to make available to their colleagues . since the web allows us to share not merely text , but pictures and sound-recordings , we can now begin an interchange of linguistic information that is of a very different nature from that which was possible in the past . anthony & helen\n \n\nham\n------------\n\n```\n\n\n\nsms\\_spam 样本示例\n\n```\n------------\nShe's borderline but yeah whatever.\nham\n------------\nURGENT! You have won a 1 week FREE membership in our £100,000 Prize Jackpot! Txt the word: CLAIM to No: 81010 T&C URL LCCLTD POBOX 4403LDNW1A7RW18\nspam\n------------\nI got a call from a landline number. . . I am asked to come to anna nagar . . . I will go in the afternoon\nham\n------------\nFree Msg: get Gnarls Barkleys \"Crazy\" ringtone TOTALLY FREE just reply GO to this message right now!\nspam\n------------\nI will be gentle princess! We will make sweet gentle love...\nham\n------------\nRefused a loan? Secured or Unsecured? Can't get credit? Call free now 0800 195 6669 or text back 'help' & we will!\nspam\n------------\nSorry, went to bed early, nightnight\nham\n------------\nFree entry in 2 a wkly comp to win FA Cup final tkts 21st May 2005. Text FA to 87121 to receive entry question(std txt rate)T&C's apply 08452810075over18's\nspam\n------------\nHMM yeah if your not too grooved out! And im looking forward to my pound special :)\nham\n------------\nFreeMsg Hey there darling it's been 3 week's now and no word back! I'd like some fun you up for it still? Tb ok! 
XxX std chgs to send, £1.50 to rcv\nspam\n------------\nHi dude hw r u da realy mising u today\nham\n------------\nWINNER!! As a valued network customer you have been selected to receivea £900 prize reward! To claim call 09061701461. Claim code KL341. Valid 12 hours only.\nspam\n------------\n\n```\n\n\n\nspam\\_assassin 样本示例\n\n```\n------------\n\"\\n>>>Robert Elz said:\\n > Date: Wed, 28 Aug 2002 09:22:34 -0500\\n > From: Chris Garrigues \\n > Message-ID: <1030544555.28815.TMDA@URL>\\n > \\n > \\n > | so I'll probably poke around at the sequences performance issues,\\n > \\n > Well, there's this wonderful piece of code in MhSeqExpand ...\\n > \\n > # Hack to weed out sequence numbers for messages that don't exist\\n > foreach m $rseq {\\n > if ![file exists $mhProfile(path)/$folder/$m] {\\n > Exmh\\_Debug $mhProfile(path)/$folder/$m not found\\n > set ix [lsearch $seq $m]\\n > set seq [lreplace $seq $ix $ix]\\n > } else {\\n > # Real hack\\n\\nAt least I'm up-front about my hacks :-)\\n\\n > break\\n > }\\n > }\\n > \\n > which is going to run slow if a sequence happens to start with a bunch\\n > of messages that don't exist. I'm not sure why it is important that the\\n > first message in the sequence returned exists, but not necessarily any\\n > of the others, but I'm sure glad it is, as MhSeqExpand gets called lots,\\n > and I don't know if I could cope if it were checking every file in the\\n > sequences it is looking at, all the time...\\n\\nThat was my thinking. My recollection about the first message being valid\\nis that the ftoc code wants to find that message to start its highlighting,\\nfor example, or you are selecting a message to display.\\n\\n > It may help to keep a list of the valid message numbers for the current\\n > folder (though that would then need to be verified against changes to the\\n > directory). Does tcl have a directory read function? I assume so...\\n\\nglob -nocomplain $mhProfile(path)/$folder \\*\\nwill return an unsorted list of the directory's contents.\\nBut the thought of keeping an in memory list of valid messages is not fun.\\nExmh already maintains in-core lists of messages in sequences, which is\\nalready pretty tricky\\n\\n > Mh\\_Sequence also goes and rereads the files (.mh\\_sequences and the\\n > context file) but I'm not sure how frequently that one is called.\\n\\nIn some places I maintain caches of files by checking their modify time,\\nbut the sequence files are soo small that by the time you stat them to\\ncheck their date stamp, you could just read them again. Also, now that\\nwe checkpoint message state on every message view, that file will change\\nevery time. In the old days exmh used to cache a bunch of state about\\nthe folder.\\n\\n--\\nBrent Welch\\nSoftware Architect, Panasas Inc\\nPioneering the World's Most Scalable and Agile Storage Network\\URL\\nwelch@URL\\n\\n\\n\\n\\n\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\nExmh-workers mailing list\\nExmh-workers@URL\\nhttps://URL\nspam\n------------\n\"Whiter teeth and a brighter smile are just a click away! \\nhttp://URL you considered professional teeth whitening? 
If so, you know it usually costs between $300 and $500 from your local dentist!\\n\\nVisit our site to learn how to professionally whiten your teeth, using the exact same whitening system your dentist uses, at a fraction of the cost!\\n\\nWe know our product is the best on the market, and we back it with a 30 day money back guarantee!\\n\\nClick here to find out more!\\nhttp://URL \\n \\n \\n \\n \\n---------------------------------------\\nTo easily remove your address from the list, go to: \\nhttp://URL allow 48-72 hours for removal.\\n\\n\"\nham\n------------\n\"> Hi,\\n> \\n> On Sun, 01 Sep 2002 00:05:03 MDT Reg Clemens wrote: \\n> \\n> [...]\\n> > in messages with GnuPG signatures. But punching the line ALWAYS\\n> > gives\\n> > \\n> > Signature made Thu Aug 29 00:27:17 2002 MDT using DSA key ID BDDF997A\\n> > Can't check signature: public key not found\\n> > \\n> > So, something else is missing.\\n> \\n> Yes, the public key of the signature you want to check :-).\\n> \\n> Are you really sure that you have the public key of the message's\\n> signature? If not, try downloading it or try to check a signature from\\n> which you know you have the public key.\\n> \\n> \\n> \\n\\nAh, sorry for not making that clearer.\\nBut no.\\nPreviously (v1.0.6 of GnuPG) there would be a slight pause at this point while\\nit went out to get the public key from a keyserver.\\nNow, whether I have the key or NOT, I get the failure message.\\n\\nIts as if it cant find gpg to execute it (but I fixed that path), so there\\nmust be something else that I am missing...\\n\\n\\n-- \\n Reg.Clemens\\n reg@URL\\n\\n\\n\\n\\n\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\_\\nExmh-users mailing list\\nExmh-users@URL\\nhttps://URL\nspam\n------------\n\"Below is the result of your feedback form. It was submitted by\\n (jamie@URL) on Tuesday, August 27, 2002 at 22:26:56\\n---------------------------------------------------------------------------\\n\\n:: [click here](\\\"URL\\\")Don't want to pay for Porn? Would you like to get it for FREE? The Honest, No Risk 100% Free Way? If you take a couple of minutes and read through our simple guide you will be able to get Free Passes to the top Paysites online![click here](\\\"URL\\\")\\n\\n---------------------------------------------------------------------------\\n\\n\\n\"\nham\n------------\n\n```\n\n\n\nspam\\_detection 样本示例\n\n```\n------------\nLooking for an epic way to boost your followers? Want to increase your likes? Well, listen up! Our site has the ultimate solution for you! With our automated system, you can get hundreds of followers in seconds! Not only that, but our team of social media experts will also provide you with the latest and greatest tips and tricks to make your page go viral! Don't wait, join the millions of successful users who have already transformed their social media game with our help! \n \n\nspam\n------------\nI'm working on a bayesian network to genetare synthetic WIM observations such as gross vehicle weight, total vehicle length, individual axle load and inter axle distances. It will be very helpful to have data sets for different sites. Thanks\nham\n------------\nHey, check out these amazing deals just for you! \n \n\n Get rich quick! Click here to learn how! \n \n\n Need a date? We've got you covered! Join our dating site now! \n \n\n Burn fat fast with our miracle pill! Guaranteed results! \n \n\n Want to see what everyone's talking about? 
Click here for the latest viral videos! \n \n\n Don't miss out on our exclusive discount codes! Shop now and save big! \n \n\n Feeling lucky? Enter our sweepstakes for\nspam\n------------\nLooking for a dataset with the latest Omicron covid variant data and confirmed locations. Any links would be appreciated, thanks!\nham\n------------\n\n```\n\n\n\nspam\\_message 样本示例\n\n```\n------------\n欣彩印刷厂主营彩盒.吊牌.产品说明书及提供四色印刷对外加工.电话xxxxxxxxxxx米先生\nspam\n------------\n微软称Windows10是“有史以来最安全的Windows版本”\nham\n------------\n您好!欢迎致电川中意货运有限公司,本公司承接成都至恩施,武汉至恩施及周边县市,覆射全国各地的整车、零担往返运输业务。我们以“信誉第一,\nspam\n------------\n我用了9个月时间学会在医院的时候自己照顾自己这件事情\nham\n------------\n在元宵节到来之际\nspam\n------------\n丰县大队要求各中队加强日常执勤\nham\n------------\n欢迎致电开锁王。本公司公安备案,工商注册。经营范围:汽车锁,保险柜,防盗门及各种民用锁芯升级!联系电话xxxx-xxxxxxx\nspam\n------------\n扬州天姿现在基本也留不住高端顾客群体了\nham\n------------\n\n```\n\n\n\nspam\\_message\\_lr 样本示例\n\n```\n------------\n3G小贴士提醒您可不要让您的流量白白浪费了哦,快来唤醒吧!与您分享杨子黎2013全新单曲《爱人好累》MV 详情点击:URL\nspam\n------------\n巫倩云:来周总办公室现在\nham\n------------\n结婚娶亲本是一件高兴事,新郎却因一辆加长林肯车而当场落泪!这是为什么?详情请点击 URL\nspam\n------------\nPP提醒你来认证啦!在电脑上登录PP租车官方网站(URL)或下载PP租车手机客户端(URL)上传身份证和驾驶证照片,即可完成租客身份认证。600余款车型,低于市场价30%,随时随地取车,开启便捷用车新时代!【PP租车】\nham\n------------\n【联通飞影】对美女自作多情的后果… URL\nspam\n------------\n您已成功添加王然(13811083077)为好友,可以接收对方的飞信消息。回复本短信可直接与对方聊天。\nham\n------------\n棕盛商业广场一期5月18号火爆认筹,孟庭苇邀您亲见世界500强签约蕲春,VIP卡全城免费热办中。0713-7300000【棕盛商业地产】\nspam\n------------\n信用卡1707于10月23日16:19消费人民币360.07,记账金额以账单显示为准。加“浦发银行信用卡中心”官方微信查询/待续【浦发银行】\nham\n------------\n\n```", "### 参考来源\n\n\n\n参考的数据来源,展开查看\n\n```\n\nURL\nURL\nURL\nURL\nURL\nURL\n\n\n\n```" ]
[ 14, 14, 10296, 22 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n### 数据来源\n\n\n数据集从网上收集整理如下:" ]
dc39062efec7515add304b98a54da2948709a808
This is a mirror of the example dataset from the paper "The Song Describer Dataset: a Corpus of Audio Captions for Music-and-Language Evaluation" by Manco et al.

Project page on GitHub: https://github.com/mulab-mir/song-describer-dataset

Dataset on Zenodo: https://zenodo.org/records/10072001

Explore the dataset on your local machine:

```python
import datasets
from renumics import spotlight

ds = datasets.load_dataset('renumics/song-describer-dataset')
spotlight.show(ds)
```

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63dd29ffaf221a78fa4ec8d1/bcoTy3h_Q0uUUo-m880Iu.png)
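A small follow-up on the snippet above: according to this record's metadata, the `path` column stores audio with decoding disabled (`decode: false`), so a row exposes a raw file reference rather than a decoded waveform. Below is a minimal sketch of inspecting one row; the column names are taken from the metadata, while the exact payload of an undecoded audio entry is an assumption to verify:

```python
import datasets

# Only a "train" split is listed in this record's metadata.
ds = datasets.load_dataset('renumics/song-describer-dataset', split='train')

row = ds[0]
print(row['caption'])   # free-text description of the track
print(row['duration'])  # duration value (float)
print(row['path'])      # undecoded audio entry (decode=False in the features)
```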
renumics/song-describer-dataset
[ "region:us" ]
2023-12-04T08:13:28+00:00
{"dataset_info": {"features": [{"name": "caption_id", "dtype": "int64"}, {"name": "track_id", "dtype": "int64"}, {"name": "caption", "dtype": "string"}, {"name": "is_valid_subset", "dtype": "bool"}, {"name": "familiarity", "dtype": "int64"}, {"name": "artist_id", "dtype": "int64"}, {"name": "album_id", "dtype": "int64"}, {"name": "path", "dtype": {"audio": {"decode": false}}}, {"name": "duration", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3543021978.0, "num_examples": 746}], "download_size": 2591774807, "dataset_size": 3543021978.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T10:16:10+00:00
[]
[]
TAGS #region-us
This is a mirror of the example dataset from the paper "The Song Describer Dataset: a Corpus of Audio Captions for Music-and-Language Evaluation" by Manco et al. Project page on GitHub: URL Dataset on Zenodo: URL Explore the dataset on your local machine: !image/png
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
945157ea700def9a1ef9ae454e1b5438eea3b2e2
# ChatGPT4 Code Instruct Originally from https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1, translate and answer using ChatGPT4. Notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/chatbot/chatgpt4-code-instruct - [synthetic-codealpaca-v1-chatgpt4.jsonl](synthetic-codealpaca-v1-chatgpt4.jsonl), 43482 rows, 274 MB. ## Example data ```python {'instruction': "Harap ubah skrip Python berikut agar ia memasukkan pengulangan 'while' daripada pengulangan 'for' yang sedia ada, yang meneruskan melalui item-item dalam senarai integer. \n\nSkrip semasa mempunyai bug di mana ia cuba mencetak objek yang berada di luar batas senarai. Betulkan ralat ini dan ubah skrip untuk menggunakan 'while' bukannya pengulangan 'for'. Pastikan skrip anda menangani senarai kosong dengan betul. \n\n```python\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Melalui senarai dan keluarkan setiap elemen individu\n for i in range(n+1):\n print(arr[i])\n```", 'output': 'Untuk mengubah skrip ini, kita Akkan menggunakan konsep pengulangan \'while\' dan memastikan index tidak melebihi panjang senarai. Juga, kita harus memastikan skrip kita menangani senarai kosong dengan betul. \n\nBerikut adalah cara mengubahnya:\n\n```python\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Pastikan senarai tidak kosong\n if n > 0:\n i = 0\n # Melalui senarai dan keluarkan setiap elemen individu\n while i < n:\n print(arr[i])\n i += 1\n else:\n print("Senarai adalah kosong.")\n```\n\nDalam skrip ini, sebelum kita mulai mengulang, kita memeriksa apakah `n`, yang merupakan panjang senarai, lebih besar dari 0. Jika tidak (yang berarti senarai kosong), kita mencetak pesan yang sesuai. \n\nKemudian, kami menggunakan pengulangan \'while\' untuk mengulang selama indeks `i` kurang dari panjang `n` senarai. Selama setiap iterasi, kita mencetak elemen senarai dan tambahkan 1 ke indeks, sehingga kita dapat bergerak ke elemen berikutnya pada iterasi berikutnya. \n\nIni akan mencegah terjadinya kesalahan index, karena kita tidak akan pernah mencoba mengakses indeks yang melebihi batas senarai.', 'instruction_en': "Please amend the subsequent Python script so that it includes a 'while' loop rather than the existing 'for' loop, which iterates through the items of an integer list.\n\nThe script currently has a bug where it attempts to print an object that is outside the bounds of the list. Fix this error and modify the script to use 'while' instead of 'for' loop. Ensure your script correctly handles empty lists. \n\n```python\n # Establish an integer list\n arr = [1, 2, 3, 4]\n\n # Determine the length of the list\n n = len(arr)\n\n # Traverse the list and output each individual element\n for i in range(n+1):\n print(arr[i])\n```", 'indon_ins': False, 'indon_output': True, 'rejected_ins': False, 'rejected_output': False, 'instruction_ms': "Sila ubah skrip Python berikut supaya ia memasukkan pengulangan 'while' daripada pengulangan 'for' yang sedia ada, yang meneruskan melalui item-item dalam senarai integer.\n\nSkrip semasa mempunyai bug di mana ia cuba mencetak objek yang berada di luar had senarai. Betulkan ralat ini dan ubah skrip untuk menggunakan 'while' bukannya pengulangan 'for'. 
Pastikan skrip anda mengendalikan senarai kosong dengan betul.\n\n```python\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Melalui senarai dan keluarkan setiap elemen individu\n for i in range(n+1):\n print(arr[i])\n```", 'output_ms': 'Untuk mengubah skrip ini, kita Akkan menggunakan konsep pengulangan \'while\' dan memastikan indeks tidak melebihi panjang senarai. Selain itu, kita harus memastikan skrip kita mengendalikan senarai kosong dengan betul.\n\nBerikut adalah cara untuk mengubahnya:\n\n```python\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Pastikan senarai tidak kosong\n if n > 0:\n i = 0\n # Melalui senarai dan keluarkan setiap elemen individu\n while i < n:\n print(arr[i])\n i += 1\n else:\n print("Senarai adalah kosong.")\n```\n\nDalam skrip ini, sebelum kita mula mengulangi, kita memeriksa sama ada `n`, yang merupakan panjang senarai, lebih besar daripada 0. Jika tidak (yang bermaksud senarai kosong), kita mencetak mesej yang sesuai.\n\nKemudian, kita menggunakan pengulangan \'while\' untuk mengulangi semasa indeks `i` kurang daripada panjang `n` senarai. Semasa setiap iterasi, kita mencetak elemen senarai dan menambahkan 1 ke indeks, supaya kita boleh bergerak ke elemen seterusnya pada iterasi seterusnya.\n\nIni akan mengelakkan ralat indeks, kerana kita tidak akan pernah mencuba mengakses indeks yang melebihi had senarai.'} ```
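The rows can be inspected with a few lines of Python. This is a minimal sketch, assuming the JSONL file listed above has been downloaded locally and that every row carries the keys shown in the example record:

```python
import json

# Read the instruction/output pairs from the downloaded JSONL file
# (filename as listed above; the local path is an assumption for illustration).
rows = []
with open("synthetic-codealpaca-v1-chatgpt4.jsonl", encoding="utf-8") as f:
    for line in f:
        rows.append(json.loads(line))

print(len(rows))                        # 43482 rows according to the card
print(rows[0]["instruction"][:200])     # Malay instruction
print(rows[0]["output"][:200])          # ChatGPT4 answer in Malay
print(rows[0]["instruction_en"][:200])  # original English instruction
```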
mesolitica/chatgpt4-code-instruct
[ "task_categories:conversational", "language:ms", "region:us" ]
2023-12-04T08:18:52+00:00
{"language": ["ms"], "task_categories": ["conversational"]}
2024-02-02T05:58:16+00:00
[]
[ "ms" ]
TAGS #task_categories-conversational #language-Malay (macrolanguage) #region-us
# ChatGPT4 Code Instruct Originally from URL translate and answer using ChatGPT4. Notebooks at URL - URL, 43482 rows, 274 MB. ## Example data python\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Melalui senarai dan keluarkan setiap elemen individu\n for i in range(n+1):\n print(arr[i])\npython\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Pastikan senarai tidak kosong\n if n > 0:\n i = 0\n # Melalui senarai dan keluarkan setiap elemen individu\n while i < n:\n print(arr[i])\n i += 1\n else:\n print("Senarai adalah kosong.")\npython\n # Establish an integer list\n arr = [1, 2, 3, 4]\n\n # Determine the length of the list\n n = len(arr)\n\n # Traverse the list and output each individual element\n for i in range(n+1):\n print(arr[i])\npython\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Melalui senarai dan keluarkan setiap elemen individu\n for i in range(n+1):\n print(arr[i])\npython\n # Menetapkan senarai integer\n arr = [1, 2, 3, 4]\n\n # Tentukan panjang senarai\n n = len(arr)\n\n # Pastikan senarai tidak kosong\n if n > 0:\n i = 0\n # Melalui senarai dan keluarkan setiap elemen individu\n while i < n:\n print(arr[i])\n i += 1\n else:\n print("Senarai adalah kosong.")\n
[ "# ChatGPT4 Code Instruct\n\nOriginally from URL translate and answer using ChatGPT4.\n\nNotebooks at URL\n\n- URL, 43482 rows, 274 MB.", "## Example data\n\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Melalui senarai dan keluarkan setiap elemen individu\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Pastikan senarai tidak kosong\\n if n > 0:\\n i = 0\\n # Melalui senarai dan keluarkan setiap elemen individu\\n while i < n:\\n print(arr[i])\\n i += 1\\n else:\\n print(\"Senarai adalah kosong.\")\\npython\\n # Establish an integer list\\n arr = [1, 2, 3, 4]\\n\\n # Determine the length of the list\\n n = len(arr)\\n\\n # Traverse the list and output each individual element\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Melalui senarai dan keluarkan setiap elemen individu\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Pastikan senarai tidak kosong\\n if n > 0:\\n i = 0\\n # Melalui senarai dan keluarkan setiap elemen individu\\n while i < n:\\n print(arr[i])\\n i += 1\\n else:\\n print(\"Senarai adalah kosong.\")\\n" ]
[ "TAGS\n#task_categories-conversational #language-Malay (macrolanguage) #region-us \n", "# ChatGPT4 Code Instruct\n\nOriginally from URL translate and answer using ChatGPT4.\n\nNotebooks at URL\n\n- URL, 43482 rows, 274 MB.", "## Example data\n\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Melalui senarai dan keluarkan setiap elemen individu\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Pastikan senarai tidak kosong\\n if n > 0:\\n i = 0\\n # Melalui senarai dan keluarkan setiap elemen individu\\n while i < n:\\n print(arr[i])\\n i += 1\\n else:\\n print(\"Senarai adalah kosong.\")\\npython\\n # Establish an integer list\\n arr = [1, 2, 3, 4]\\n\\n # Determine the length of the list\\n n = len(arr)\\n\\n # Traverse the list and output each individual element\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Melalui senarai dan keluarkan setiap elemen individu\\n for i in range(n+1):\\n print(arr[i])\\npython\\n # Menetapkan senarai integer\\n arr = [1, 2, 3, 4]\\n\\n # Tentukan panjang senarai\\n n = len(arr)\\n\\n # Pastikan senarai tidak kosong\\n if n > 0:\\n i = 0\\n # Melalui senarai dan keluarkan setiap elemen individu\\n while i < n:\\n print(arr[i])\\n i += 1\\n else:\\n print(\"Senarai adalah kosong.\")\\n" ]
[ 26, 40, 473 ]
[ "passage: TAGS\n#task_categories-conversational #language-Malay (macrolanguage) #region-us \n# ChatGPT4 Code Instruct\n\nOriginally from URL translate and answer using ChatGPT4.\n\nNotebooks at URL\n\n- URL, 43482 rows, 274 MB." ]
4fdb4e54b6ba7a72e539a62671453a8cf1bea55c
https://www.ehu.eus/ccwintco/index.php?title=Hyperspectral_Remote_Sensing_Scenes#Indian_Pines
Tangliwen/Indian_Pines
[ "region:us" ]
2023-12-04T08:25:40+00:00
{}
2023-12-04T08:26:27+00:00
[]
[]
TAGS #region-us
URL
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
286a8bb25f0825f29f197bd206cfa38d5a30648d
The dataset comprises over four hundred thousand handwritten names obtained from charitable initiatives. Character recognition employs image processing techniques to transform characters present on scanned documents into digital formats. It generally performs well on machine-printed fonts. Nonetheless, machines still encounter formidable obstacles in accurately identifying handwritten characters due to the vast diversity in individual writing styles. The total number of first names is 206,799, and the total number of surnames is 207,024. The data was partitioned into a training set (331,059 samples), a testing set (41,382 samples), and a validation set (41,382 samples). FYI: I am not the owner of this dataset; I took it from Kaggle. It is a very interesting and useful dataset for many computer vision applications.
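A minimal loading sketch follows. It assumes the Hub copy of the Kaggle data resolves directly with the `datasets` library; the column layout is not documented in this card, so the code only inspects what is actually there before any further use:

```python
from datasets import load_dataset

# Assumption: the repository can be loaded directly with load_dataset.
ds = load_dataset("gymprathap/Handwriting-Recognition-Dataset")

print(ds)  # split names and sizes
first_split = next(iter(ds.values()))
print(first_split.features)  # inspect the actual columns before relying on them
```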
gymprathap/Handwriting-Recognition-Dataset
[ "task_categories:token-classification", "size_categories:1K<n<10K", "language:en", "license:apache-2.0", "code", "region:us" ]
2023-12-04T08:32:24+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["token-classification"], "pretty_name": "Handwriting Recognition Dataset", "tags": ["code"]}
2023-12-04T12:51:52+00:00
[]
[ "en" ]
TAGS #task_categories-token-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #code #region-us
The dataset comprises over four hundred thousand handwritten names obtained from charitable initiatives. Character recognition employs image processing techniques to transform characters present on scanned documents into digital formats. It generally performs well on machine-printed fonts. Nonetheless, machines still encounter formidable obstacles in accurately identifying handwritten characters due to the vast diversity in individual writing styles. The total number of first names is 206,799, and the total number of surnames is 207,024. The data was partitioned into a training set (331,059 samples), a testing set (41,382 samples), and a validation set (41,382 samples). FYI: I am not the owner of this dataset; I took it from Kaggle. It is a very interesting and useful dataset for many computer vision applications.
[]
[ "TAGS\n#task_categories-token-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #code #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#task_categories-token-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #code #region-us \n" ]
9437b83cd317e01f3a8011f5d9eef5c00eede83f
# Dataset Card for "kor_aeslc"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

# Source Data Citation Information

```
@inproceedings{zhang-tetreault-2019-email,
    title = "This Email Could Save Your Life: Introducing the Task of Email Subject Line Generation",
    author = "Zhang, Rui and Tetreault, Joel",
    booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
    month = jul,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/P19-1043",
    doi = "10.18653/v1/P19-1043",
    pages = "446--456",
}
```
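A minimal loading sketch for this mirror; the repository id is taken from this record and the column names from the features listed in its metadata:

```python
from datasets import load_dataset

# Splits listed in the metadata: train / validation / test.
ds = load_dataset("KETI-AIR/kor_aeslc")

sample = ds["train"][0]
print(sample["email_body"][:300])    # email body text
print("->", sample["subject_line"])  # target subject line
```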
KETI-AIR/kor_aeslc
[ "license:cc-by-nc-sa-4.0", "region:us" ]
2023-12-04T08:52:08+00:00
{"license": "cc-by-nc-sa-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "data_index_by_user", "dtype": "int32"}, {"name": "subject_line", "dtype": "string"}, {"name": "email_body", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11852680, "num_examples": 14436}, {"name": "validation", "num_bytes": 1699457, "num_examples": 1960}, {"name": "test", "num_bytes": 1539749, "num_examples": 1906}], "download_size": 8634308, "dataset_size": 15091886}}
2023-12-05T01:48:14+00:00
[]
[]
TAGS #license-cc-by-nc-sa-4.0 #region-us
# Dataset Card for "kor_aeslc" More Information needed # Source Data Citation Information
[ "# Dataset Card for \"kor_aeslc\"\n\nMore Information needed", "# Source Data Citation Information" ]
[ "TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n", "# Dataset Card for \"kor_aeslc\"\n\nMore Information needed", "# Source Data Citation Information" ]
[ 19, 15, 6 ]
[ "passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n# Dataset Card for \"kor_aeslc\"\n\nMore Information needed# Source Data Citation Information" ]
2083a7e30878af2993632b2fc3565ed4a2159534
# Dataset Card for "h-and-m-fashion-caption" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tomytjandra/h-and-m-fashion-caption
[ "region:us" ]
2023-12-04T09:30:19+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 7843224039.084, "num_examples": 20491}], "download_size": 6302088359, "dataset_size": 7843224039.084}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T11:07:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for "h-and-m-fashion-caption" More Information needed
[ "# Dataset Card for \"h-and-m-fashion-caption\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"h-and-m-fashion-caption\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"h-and-m-fashion-caption\"\n\nMore Information needed" ]
c966383b4dff05cd0f5b16b31e7b81421af12832
# This dataset includes TheGambler novel, reformatted for QA QSI, and is based on: https://github.com/Priya22/project-dialogism-novel-corpus/tree/master/data

Thank you, authors of the original dataset!

## See all my research files at: https://github.com/Kkordik/NovelQSI/tree/main
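The features listed in this record's metadata (context, question, answers with answer_start and text, id) follow the usual extractive-QA layout, so a minimal loading sketch looks like this; the column names below are taken from that metadata:

```python
from datasets import load_dataset

# Splits listed in the metadata: train / test.
ds = load_dataset("Kkordik/NovelQSI")

ex = ds["train"][0]
print(ex["question"])
print(ex["context"][:200])
# Answers are stored as parallel lists of span texts and start offsets.
print(ex["answers"]["text"], ex["answers"]["answer_start"])
```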
Kkordik/NovelQSI
[ "task_categories:question-answering", "size_categories:n<1K", "language:en", "license:apache-2.0", "region:us" ]
2023-12-04T09:36:10+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "pretty_name": "NovelQSI", "dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int64"}, {"name": "text", "sequence": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5783487.46716698, "num_examples": 746}, {"name": "test", "num_bytes": 2480852.5328330207, "num_examples": 320}], "download_size": 3519056, "dataset_size": 8264340}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-04T21:42:28+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #size_categories-n<1K #language-English #license-apache-2.0 #region-us
# This dataset includes TheGambler novel, reformatted for QA QSI, and is based on: URL Thank you, authors of the original dataset! ## See all my research files at: URL
[ "# This dataset includes TheGambler novel, reformated for QA QSI and is based on:\n\nURL\n\nThank you, autors of the original dataset!", "## See all my research files at: \n\nURL" ]
[ "TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #license-apache-2.0 #region-us \n", "# This dataset includes TheGambler novel, reformated for QA QSI and is based on:\n\nURL\n\nThank you, autors of the original dataset!", "## See all my research files at: \n\nURL" ]
[ 40, 35, 9 ]
[ "passage: TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #license-apache-2.0 #region-us \n# This dataset includes TheGambler novel, reformated for QA QSI and is based on:\n\nURL\n\nThank you, autors of the original dataset!## See all my research files at: \n\nURL" ]
95033c738a86ebb9fac1fa491648ea0ebdd4edeb
# TextGPT4V 30K Dataset Card

## Dataset details

**Dataset type:** TextGPT4V 30K is a set of GPT4V-generated instruction data to enhance the textual capabilities of VLMs. Constructed by prompting 'gpt-4-vision-preview'.

**Dataset date:** TextGPT4V 30K was collected in Nov 2023 by prompting the 'gpt-4-vision-preview' API.

**Paper or resources for more information:** https://github.com/Etelis/TextGPT4V

**License:** Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: https://openai.com/policies/terms-of-use
pig4431/TextGPT4V
[ "task_categories:visual-question-answering", "task_categories:question-answering", "size_categories:10K<n<100K", "language:en", "region:us" ]
2023-12-04T09:46:55+00:00
{"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["visual-question-answering", "question-answering"], "pretty_name": "TextGPT4V"}
2023-12-04T09:54:06+00:00
[]
[ "en" ]
TAGS #task_categories-visual-question-answering #task_categories-question-answering #size_categories-10K<n<100K #language-English #region-us
# TextGPT4V 30K Dataset Card ## Dataset details Dataset type: TextGPT4V 30K is a set of GPT4V-generated instruction data to enhance the textual capabilities of VLMs. Constructed by prompting 'gpt-4-vision-preview'. Dataset date: TextGPT4V 30K was collected in Nov 2023 by prompting the 'gpt-4-vision-preview' API. Paper or resources for more information: URL License: Attribution-NonCommercial 4.0 International. It should abide by the policy of OpenAI: URL
[ "# TextGPT4V 30K Dataset Card", "## Dataset details\n\nDataset type:\nTextGPT4V 30K is a set of GPT4V generated instruction data to enchance the textual capabilities of VLMs. \nConstructed with prompting 'gpt-4-vision-preview'.\n\nDataset date:\nTextGPT4V 30K was collected in Nov 2023, by prompting 'gpt-4-vision-preview' API.\n\nPaper or resources for more information:\nURL\n\nLicense:\nAttribution-NonCommercial 4.0 International\nIt should abide by the policy of OpenAI: URL" ]
[ "TAGS\n#task_categories-visual-question-answering #task_categories-question-answering #size_categories-10K<n<100K #language-English #region-us \n", "# TextGPT4V 30K Dataset Card", "## Dataset details\n\nDataset type:\nTextGPT4V 30K is a set of GPT4V generated instruction data to enchance the textual capabilities of VLMs. \nConstructed with prompting 'gpt-4-vision-preview'.\n\nDataset date:\nTextGPT4V 30K was collected in Nov 2023, by prompting 'gpt-4-vision-preview' API.\n\nPaper or resources for more information:\nURL\n\nLicense:\nAttribution-NonCommercial 4.0 International\nIt should abide by the policy of OpenAI: URL" ]
[ 49, 11, 121 ]
[ "passage: TAGS\n#task_categories-visual-question-answering #task_categories-question-answering #size_categories-10K<n<100K #language-English #region-us \n# TextGPT4V 30K Dataset Card## Dataset details\n\nDataset type:\nTextGPT4V 30K is a set of GPT4V generated instruction data to enchance the textual capabilities of VLMs. \nConstructed with prompting 'gpt-4-vision-preview'.\n\nDataset date:\nTextGPT4V 30K was collected in Nov 2023, by prompting 'gpt-4-vision-preview' API.\n\nPaper or resources for more information:\nURL\n\nLicense:\nAttribution-NonCommercial 4.0 International\nIt should abide by the policy of OpenAI: URL" ]
db2c21846d0eccae8e9c331be3d434099d4e501d
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Traditional Chinese Medicine (TCM) Pharmacy track examination. It is important to note that this dataset was collected from online sources, and there may be some discrepancies between this data and the actual examination. - **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II
FreedomIntelligence/2023_Pharmacist_Licensure_Examination-TCM_track
[ "license:apache-2.0", "region:us" ]
2023-12-04T10:54:07+00:00
{"license": "apache-2.0"}
2023-12-04T11:39:31+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Traditional Chinese Medicine (TCM) Pharmacy track examination. It is important to note that this dataset was collected from online sources, and there may be some discrepancies between this data and the actual examination. - Repository: URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
21f02e1a8e05c9fa7273b675d5b8bdd1eb872181
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Pharmacy track examination. It is important to note that this dataset was collected from online sources, and there may be some discrepancies between this data and the actual examination. - **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II
FreedomIntelligence/2023_Pharmacist_Licensure_Examination-Pharmacy_track
[ "license:apache-2.0", "region:us" ]
2023-12-04T10:54:29+00:00
{"license": "apache-2.0"}
2023-12-04T11:39:08+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
The 2023 Chinese National Pharmacist Licensure Examination is divided into two distinct tracks: the Pharmacy track and the Traditional Chinese Medicine (TCM) Pharmacy track. The data provided here pertains to the Pharmacy track examination. It is important to note that this dataset was collected from online sources, and there may be some discrepancies between this data and the actual examination. - Repository: URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
43cc77a0fa37e26874274548a519a73f51296a48
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.3-preview - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.3-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.3-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.3-preview", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:02:12.262420](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.3-preview/blob/main/results_2023-12-04T11-02-12.262420.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5468346216265638, "acc_stderr": 0.03408547225046011, "acc_norm": 0.5513258664220261, "acc_norm_stderr": 0.0348106640487534, "mc1": 0.37454100367197063, "mc1_stderr": 0.016943535128405324, "mc2": 0.5503392001553602, "mc2_stderr": 0.0150300089118087 }, "harness|arc:challenge|25": { "acc": 0.5255972696245734, "acc_stderr": 0.014592230885298962, "acc_norm": 0.53839590443686, "acc_norm_stderr": 0.014568245550296356 }, "harness|hellaswag|10": { "acc": 0.5545708026289584, "acc_stderr": 0.004959973514772515, "acc_norm": 0.7485560645289783, "acc_norm_stderr": 0.004329565016527315 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5789473684210527, "acc_stderr": 0.04017901275981749, "acc_norm": 0.5789473684210527, "acc_norm_stderr": 0.04017901275981749 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6041666666666666, "acc_stderr": 0.04089465449325582, "acc_norm": 0.6041666666666666, "acc_norm_stderr": 0.04089465449325582 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956913, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956913 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.49710982658959535, "acc_stderr": 0.03812400565974833, "acc_norm": 0.49710982658959535, "acc_norm_stderr": 0.03812400565974833 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.28431372549019607, "acc_stderr": 0.04488482852329017, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.04488482852329017 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4595744680851064, "acc_stderr": 0.032579014820998356, "acc_norm": 0.4595744680851064, "acc_norm_stderr": 0.032579014820998356 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.04161808503501531, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.04161808503501531 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3544973544973545, "acc_stderr": 0.024636830602842, "acc_norm": 0.3544973544973545, "acc_norm_stderr": 0.024636830602842 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3492063492063492, "acc_stderr": 0.042639068927951336, "acc_norm": 
0.3492063492063492, "acc_norm_stderr": 0.042639068927951336 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.027430866579973467, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.027430866579973467 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4088669950738916, "acc_stderr": 0.034590588158832314, "acc_norm": 0.4088669950738916, "acc_norm_stderr": 0.034590588158832314 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6545454545454545, "acc_stderr": 0.03713158067481913, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.03713158067481913 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03191178226713548, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03191178226713548 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7357512953367875, "acc_stderr": 0.03182155050916644, "acc_norm": 0.7357512953367875, "acc_norm_stderr": 0.03182155050916644 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5256410256410257, "acc_stderr": 0.025317649726448663, "acc_norm": 0.5256410256410257, "acc_norm_stderr": 0.025317649726448663 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.02831753349606647, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.02831753349606647 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.542016806722689, "acc_stderr": 0.03236361111951941, "acc_norm": 0.542016806722689, "acc_norm_stderr": 0.03236361111951941 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7119266055045872, "acc_stderr": 0.019416445892636032, "acc_norm": 0.7119266055045872, "acc_norm_stderr": 0.019416445892636032 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6911764705882353, "acc_stderr": 0.03242661719827218, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.03242661719827218 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.030165137867847018, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.030165137867847018 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.032190792004199956, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6446280991735537, "acc_stderr": 0.0436923632657398, "acc_norm": 0.6446280991735537, "acc_norm_stderr": 0.0436923632657398 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6851851851851852, "acc_stderr": 0.04489931073591312, "acc_norm": 0.6851851851851852, "acc_norm_stderr": 0.04489931073591312 }, 
"harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6441717791411042, "acc_stderr": 0.03761521380046734, "acc_norm": 0.6441717791411042, "acc_norm_stderr": 0.03761521380046734 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.046897659372781335, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.046897659372781335 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8418803418803419, "acc_stderr": 0.023902325549560396, "acc_norm": 0.8418803418803419, "acc_norm_stderr": 0.023902325549560396 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7496807151979565, "acc_stderr": 0.015491088951494569, "acc_norm": 0.7496807151979565, "acc_norm_stderr": 0.015491088951494569 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5924855491329479, "acc_stderr": 0.026454578146931505, "acc_norm": 0.5924855491329479, "acc_norm_stderr": 0.026454578146931505 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24692737430167597, "acc_stderr": 0.01442229220480886, "acc_norm": 0.24692737430167597, "acc_norm_stderr": 0.01442229220480886 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6209150326797386, "acc_stderr": 0.027780141207023334, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.027780141207023334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6012861736334405, "acc_stderr": 0.0278093225857745, "acc_norm": 0.6012861736334405, "acc_norm_stderr": 0.0278093225857745 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5895061728395061, "acc_stderr": 0.027371350925124764, "acc_norm": 0.5895061728395061, "acc_norm_stderr": 0.027371350925124764 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.35106382978723405, "acc_stderr": 0.028473501272963764, "acc_norm": 0.35106382978723405, "acc_norm_stderr": 0.028473501272963764 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.39895697522816165, "acc_stderr": 0.012506757655293665, "acc_norm": 0.39895697522816165, "acc_norm_stderr": 0.012506757655293665 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5735294117647058, "acc_stderr": 0.030042615832714864, "acc_norm": 0.5735294117647058, "acc_norm_stderr": 0.030042615832714864 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5228758169934641, "acc_stderr": 0.02020665318788479, "acc_norm": 0.5228758169934641, "acc_norm_stderr": 0.02020665318788479 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6408163265306123, "acc_stderr": 0.030713560455108493, "acc_norm": 0.6408163265306123, "acc_norm_stderr": 0.030713560455108493 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7412935323383084, "acc_stderr": 0.030965903123573044, "acc_norm": 0.7412935323383084, "acc_norm_stderr": 0.030965903123573044 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, 
"acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.034462962170884265, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.034462962170884265 }, "harness|truthfulqa:mc|0": { "mc1": 0.37454100367197063, "mc1_stderr": 0.016943535128405324, "mc2": 0.5503392001553602, "mc2_stderr": 0.0150300089118087 }, "harness|winogrande|5": { "acc": 0.7458563535911602, "acc_stderr": 0.012236307219708266 }, "harness|gsm8k|5": { "acc": 0.2896133434420015, "acc_stderr": 0.012493927348659629 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.3-preview
[ "region:us" ]
2023-12-04T11:05:04+00:00
{"pretty_name": "Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview", "dataset_summary": "Dataset automatically created during the evaluation run of model [WebraftAI/synapsellm-7b-mistral-v0.3-preview](https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.3-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.3-preview\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:02:12.262420](https://huggingface.co/datasets/open-llm-leaderboard/details_WebraftAI__synapsellm-7b-mistral-v0.3-preview/blob/main/results_2023-12-04T11-02-12.262420.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5468346216265638,\n \"acc_stderr\": 0.03408547225046011,\n \"acc_norm\": 0.5513258664220261,\n \"acc_norm_stderr\": 0.0348106640487534,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5503392001553602,\n \"mc2_stderr\": 0.0150300089118087\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.014568245550296356\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5545708026289584,\n \"acc_stderr\": 0.004959973514772515,\n \"acc_norm\": 0.7485560645289783,\n \"acc_norm_stderr\": 0.004329565016527315\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.04161808503501531,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.04161808503501531\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\": 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.042639068927951336,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.042639068927951336\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713548,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713548\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916644,\n \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916644\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448663,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636032,\n \"acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636032\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847018,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n 
\"acc_stderr\": 0.015491088951494569,\n \"acc_norm\": 0.7496807151979565,\n \"acc_norm_stderr\": 0.015491088951494569\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.01442229220480886,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.01442229220480886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023334,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39895697522816165,\n \"acc_stderr\": 0.012506757655293665,\n \"acc_norm\": 0.39895697522816165,\n \"acc_norm_stderr\": 0.012506757655293665\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714864,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714864\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.02020665318788479,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.02020665318788479\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573044,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5503392001553602,\n \"mc2_stderr\": 0.0150300089118087\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708266\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2896133434420015,\n \"acc_stderr\": 0.012493927348659629\n }\n}\n```", 
"repo_url": "https://huggingface.co/WebraftAI/synapsellm-7b-mistral-v0.3-preview", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-02-12.262420.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-02-12.262420.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-02-12.262420.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-02-12.262420.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-02-12.262420.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_02_12.262420", "path": ["**/details_harness|winogrande|5_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-02-12.262420.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T11_02_12.262420", "path": ["results_2023-12-04T11-02-12.262420.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T11-02-12.262420.parquet"]}]}]}
2023-12-04T11:05:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.3-preview on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:02:12.262420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.3-preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:02:12.262420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.3-preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:02:12.262420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 178, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WebraftAI/synapsellm-7b-mistral-v0.3-preview## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WebraftAI/synapsellm-7b-mistral-v0.3-preview on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:02:12.262420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1ce47ce3d719792c74e2b14495ee94b603fd7d18
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-flan - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-100step-flan](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-flan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a short loading sketch for it follows below. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-flan", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:18:09.449875](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-flan/blob/main/results_2023-12-04T11-18-09.449875.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4851611654629593, "acc_stderr": 0.03426525263666884, "acc_norm": 0.4898062511997487, "acc_norm_stderr": 0.03501503528186299, "mc1": 0.30354957160342716, "mc1_stderr": 0.016095884155386847, "mc2": 0.4566539974272441, "mc2_stderr": 0.01564839684478776 }, "harness|arc:challenge|25": { "acc": 0.4948805460750853, "acc_stderr": 0.01461062489030916, "acc_norm": 0.5290102389078498, "acc_norm_stderr": 0.014586776355294323 }, "harness|hellaswag|10": { "acc": 0.5961959768970324, "acc_stderr": 0.004896563126116811, "acc_norm": 0.7844054969129656, "acc_norm_stderr": 0.004103936879526262 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.04266763404099582, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.48026315789473684, "acc_stderr": 0.040657710025626036, "acc_norm": 0.48026315789473684, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5433962264150943, "acc_stderr": 0.03065674869673943, "acc_norm": 0.5433962264150943, "acc_norm_stderr": 0.03065674869673943 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5208333333333334, "acc_stderr": 0.041775789507399935, "acc_norm": 0.5208333333333334, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4046242774566474, "acc_stderr": 0.03742461193887248, "acc_norm": 0.4046242774566474, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4085106382978723, "acc_stderr": 0.03213418026701576, "acc_norm": 0.4085106382978723, "acc_norm_stderr": 0.03213418026701576 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.045595221419582166, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.045595221419582166 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.041665675771015785, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.023517294335963286, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.023517294335963286 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.24603174603174602, "acc_stderr": 
0.03852273364924314, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.03852273364924314 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5225806451612903, "acc_stderr": 0.02841498501970786, "acc_norm": 0.5225806451612903, "acc_norm_stderr": 0.02841498501970786 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3645320197044335, "acc_stderr": 0.033864057460620905, "acc_norm": 0.3645320197044335, "acc_norm_stderr": 0.033864057460620905 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5878787878787879, "acc_stderr": 0.03843566993588717, "acc_norm": 0.5878787878787879, "acc_norm_stderr": 0.03843566993588717 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6060606060606061, "acc_stderr": 0.034812853382329624, "acc_norm": 0.6060606060606061, "acc_norm_stderr": 0.034812853382329624 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7150259067357513, "acc_stderr": 0.032577140777096614, "acc_norm": 0.7150259067357513, "acc_norm_stderr": 0.032577140777096614 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4256410256410256, "acc_stderr": 0.02506909438729654, "acc_norm": 0.4256410256410256, "acc_norm_stderr": 0.02506909438729654 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.02684205787383371, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.02684205787383371 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.42436974789915966, "acc_stderr": 0.03210479051015776, "acc_norm": 0.42436974789915966, "acc_norm_stderr": 0.03210479051015776 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.036848815213890225, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.036848815213890225 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6770642201834862, "acc_stderr": 0.02004811592341532, "acc_norm": 0.6770642201834862, "acc_norm_stderr": 0.02004811592341532 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.0321495214780275, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.0321495214780275 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03308611113236434, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03308611113236434 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.679324894514768, "acc_stderr": 0.030381931949990403, "acc_norm": 0.679324894514768, "acc_norm_stderr": 0.030381931949990403 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5725190839694656, "acc_stderr": 0.04338920305792401, "acc_norm": 0.5725190839694656, "acc_norm_stderr": 0.04338920305792401 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6363636363636364, "acc_stderr": 0.043913262867240704, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.043913262867240704 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6018518518518519, "acc_stderr": 0.04732332615978813, "acc_norm": 
0.6018518518518519, "acc_norm_stderr": 0.04732332615978813 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.558282208588957, "acc_stderr": 0.03901591825836184, "acc_norm": 0.558282208588957, "acc_norm_stderr": 0.03901591825836184 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.04656147110012351, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.04656147110012351 }, "harness|hendrycksTest-marketing|5": { "acc": 0.717948717948718, "acc_stderr": 0.029480360549541194, "acc_norm": 0.717948717948718, "acc_norm_stderr": 0.029480360549541194 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6768837803320562, "acc_stderr": 0.016723726512343048, "acc_norm": 0.6768837803320562, "acc_norm_stderr": 0.016723726512343048 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5173410404624278, "acc_stderr": 0.02690290045866664, "acc_norm": 0.5173410404624278, "acc_norm_stderr": 0.02690290045866664 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2212290502793296, "acc_stderr": 0.013882164598887275, "acc_norm": 0.2212290502793296, "acc_norm_stderr": 0.013882164598887275 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5163398692810458, "acc_stderr": 0.02861462475280544, "acc_norm": 0.5163398692810458, "acc_norm_stderr": 0.02861462475280544 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5659163987138264, "acc_stderr": 0.02815023224453559, "acc_norm": 0.5659163987138264, "acc_norm_stderr": 0.02815023224453559 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.0275860062216077, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.0275860062216077 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.36879432624113473, "acc_stderr": 0.02878222756134724, "acc_norm": 0.36879432624113473, "acc_norm_stderr": 0.02878222756134724 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.34876140808344197, "acc_stderr": 0.01217203515712712, "acc_norm": 0.34876140808344197, "acc_norm_stderr": 0.01217203515712712 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.45955882352941174, "acc_stderr": 0.03027332507734576, "acc_norm": 0.45955882352941174, "acc_norm_stderr": 0.03027332507734576 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.48366013071895425, "acc_stderr": 0.020217030653186453, "acc_norm": 0.48366013071895425, "acc_norm_stderr": 0.020217030653186453 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5224489795918368, "acc_stderr": 0.031976941187136725, "acc_norm": 0.5224489795918368, "acc_norm_stderr": 0.031976941187136725 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6467661691542289, "acc_stderr": 0.03379790611796777, "acc_norm": 0.6467661691542289, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-virology|5": { "acc": 0.43373493975903615, "acc_stderr": 0.03858158940685517, "acc_norm": 
0.43373493975903615, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.034240429246915824, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.034240429246915824 }, "harness|truthfulqa:mc|0": { "mc1": 0.30354957160342716, "mc1_stderr": 0.016095884155386847, "mc2": 0.4566539974272441, "mc2_stderr": 0.01564839684478776 }, "harness|winogrande|5": { "acc": 0.7237569060773481, "acc_stderr": 0.012566815015698158 }, "harness|gsm8k|5": { "acc": 0.19484457922668688, "acc_stderr": 0.010910039409578768 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-flan
[ "region:us" ]
2023-12-04T11:21:05+00:00
{"pretty_name": "Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan", "dataset_summary": "Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-100step-flan](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-flan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-flan\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:18:09.449875](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-100step-flan/blob/main/results_2023-12-04T11-18-09.449875.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4851611654629593,\n \"acc_stderr\": 0.03426525263666884,\n \"acc_norm\": 0.4898062511997487,\n \"acc_norm_stderr\": 0.03501503528186299,\n \"mc1\": 0.30354957160342716,\n \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.4566539974272441,\n \"mc2_stderr\": 0.01564839684478776\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.01461062489030916,\n \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.014586776355294323\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5961959768970324,\n \"acc_stderr\": 0.004896563126116811,\n \"acc_norm\": 0.7844054969129656,\n \"acc_norm_stderr\": 0.004103936879526262\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n \"acc_norm\": 0.7150259067357513,\n 
\"acc_norm_stderr\": 0.032577140777096614\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.02506909438729654,\n \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.02506909438729654\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341532,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341532\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990403,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990403\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.6768837803320562,\n \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n \"acc_stderr\": 0.013882164598887275,\n \"acc_norm\": 0.2212290502793296,\n \"acc_norm_stderr\": 0.013882164598887275\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n \"acc_stderr\": 0.02815023224453559,\n \"acc_norm\": 0.5659163987138264,\n \"acc_norm_stderr\": 0.02815023224453559\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.0275860062216077,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.0275860062216077\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34876140808344197,\n \"acc_stderr\": 0.01217203515712712,\n \"acc_norm\": 0.34876140808344197,\n \"acc_norm_stderr\": 0.01217203515712712\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734576,\n \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186453,\n \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186453\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.4566539974272441,\n \"mc2_stderr\": 0.01564839684478776\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7237569060773481,\n \"acc_stderr\": 0.012566815015698158\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19484457922668688,\n \"acc_stderr\": 
0.010910039409578768\n }\n}\n```", "repo_url": "https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-100step-flan", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-18-09.449875.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-18-09.449875.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-18-09.449875.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-18-09.449875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-18-09.449875.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_18_09.449875", "path": ["**/details_harness|winogrande|5_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-18-09.449875.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T11_18_09.449875", "path": ["results_2023-12-04T11-18-09.449875.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T11-18-09.449875.parquet"]}]}]}
2023-12-04T11:21:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-100step-flan on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:18:09.449875 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-100step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:18:09.449875(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-100step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:18:09.449875(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-100step-flan## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-100step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:18:09.449875(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
d943b260119ec832929d92da33a559f1d0575bea
# Dataset Card for Evaluation run of abacusai/Giraffe-beta-13b-32k ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/abacusai/Giraffe-beta-13b-32k - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [abacusai/Giraffe-beta-13b-32k](https://huggingface.co/abacusai/Giraffe-beta-13b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:23:46.573377](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k/blob/main/results_2023-12-04T11-23-46.573377.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5350938644943618, "acc_stderr": 0.03417616181346088, "acc_norm": 0.5410188997846813, "acc_norm_stderr": 0.03491906704101023, "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.4258338965668367, "mc2_stderr": 0.014377658114706975 }, "harness|arc:challenge|25": { "acc": 0.5204778156996587, "acc_stderr": 0.014599131353035009, "acc_norm": 0.5563139931740614, "acc_norm_stderr": 0.01451842182567045 }, "harness|hellaswag|10": { "acc": 0.6017725552678749, "acc_stderr": 0.004885323175701676, "acc_norm": 0.8042222664807808, "acc_norm_stderr": 0.003959872578165267 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5526315789473685, "acc_stderr": 0.0404633688397825, "acc_norm": 0.5526315789473685, "acc_norm_stderr": 0.0404633688397825 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5584905660377358, "acc_stderr": 0.03056159042673183, "acc_norm": 0.5584905660377358, "acc_norm_stderr": 0.03056159042673183 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5347222222222222, "acc_stderr": 0.04171115858181618, "acc_norm": 0.5347222222222222, "acc_norm_stderr": 0.04171115858181618 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, 
"acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.47398843930635837, "acc_stderr": 0.038073017265045125, "acc_norm": 0.47398843930635837, "acc_norm_stderr": 0.038073017265045125 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.03227834510146268, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.04372748290278007, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.04372748290278007 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.041665675771015785, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.041665675771015785 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.31216931216931215, "acc_stderr": 0.0238652068369726, "acc_norm": 0.31216931216931215, "acc_norm_stderr": 0.0238652068369726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6064516129032258, "acc_stderr": 0.027791878753132267, "acc_norm": 0.6064516129032258, "acc_norm_stderr": 0.027791878753132267 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.37438423645320196, "acc_stderr": 0.03405155380561953, "acc_norm": 0.37438423645320196, "acc_norm_stderr": 0.03405155380561953 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6, "acc_stderr": 0.03825460278380026, "acc_norm": 0.6, "acc_norm_stderr": 0.03825460278380026 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6161616161616161, "acc_stderr": 0.03464881675016341, "acc_norm": 0.6161616161616161, "acc_norm_stderr": 0.03464881675016341 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7564766839378239, "acc_stderr": 0.030975436386845436, "acc_norm": 0.7564766839378239, "acc_norm_stderr": 0.030975436386845436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4794871794871795, "acc_stderr": 0.02532966316348994, "acc_norm": 0.4794871794871795, "acc_norm_stderr": 0.02532966316348994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.0324371805513741, 
"acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.0324371805513741 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526733, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526733 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.708256880733945, "acc_stderr": 0.019489300968876508, "acc_norm": 0.708256880733945, "acc_norm_stderr": 0.019489300968876508 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.030165137867847, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.030165137867847 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.03227790442850499, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.03227790442850499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5419847328244275, "acc_stderr": 0.04369802690578756, "acc_norm": 0.5419847328244275, "acc_norm_stderr": 0.04369802690578756 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6203703703703703, "acc_stderr": 0.04691521224077742, "acc_norm": 0.6203703703703703, "acc_norm_stderr": 0.04691521224077742 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6257668711656442, "acc_stderr": 0.03802068102899615, "acc_norm": 0.6257668711656442, "acc_norm_stderr": 0.03802068102899615 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.0432704093257873, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.0432704093257873 }, "harness|hendrycksTest-management|5": { "acc": 0.6407766990291263, "acc_stderr": 0.04750458399041696, "acc_norm": 0.6407766990291263, "acc_norm_stderr": 0.04750458399041696 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7478632478632479, "acc_stderr": 0.02844796547623102, "acc_norm": 0.7478632478632479, "acc_norm_stderr": 0.02844796547623102 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7049808429118773, "acc_stderr": 0.01630836377293273, "acc_norm": 0.7049808429118773, "acc_norm_stderr": 0.01630836377293273 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6098265895953757, "acc_stderr": 0.02626167760780665, "acc_norm": 0.6098265895953757, "acc_norm_stderr": 0.02626167760780665 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36983240223463687, "acc_stderr": 0.016145881256056215, "acc_norm": 0.36983240223463687, "acc_norm_stderr": 0.016145881256056215 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6013071895424836, "acc_stderr": 0.028036092273891776, "acc_norm": 0.6013071895424836, "acc_norm_stderr": 0.028036092273891776 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.617363344051447, "acc_stderr": 0.02760468902858199, "acc_norm": 0.617363344051447, "acc_norm_stderr": 0.02760468902858199 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6141975308641975, "acc_stderr": 
0.027085401226132143, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132143 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.029275532159704725, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.029275532159704725 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3983050847457627, "acc_stderr": 0.012503310565166244, "acc_norm": 0.3983050847457627, "acc_norm_stderr": 0.012503310565166244 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5147058823529411, "acc_stderr": 0.03035969707904612, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5408496732026143, "acc_stderr": 0.020160213617222516, "acc_norm": 0.5408496732026143, "acc_norm_stderr": 0.020160213617222516 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.046075820907199756, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.046075820907199756 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6448979591836734, "acc_stderr": 0.030635655150387638, "acc_norm": 0.6448979591836734, "acc_norm_stderr": 0.030635655150387638 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.03115715086935555, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.03115715086935555 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.46987951807228917, "acc_stderr": 0.03885425420866766, "acc_norm": 0.46987951807228917, "acc_norm_stderr": 0.03885425420866766 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7426900584795322, "acc_stderr": 0.03352799844161865, "acc_norm": 0.7426900584795322, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.4258338965668367, "mc2_stderr": 0.014377658114706975 }, "harness|winogrande|5": { "acc": 0.7458563535911602, "acc_stderr": 0.012236307219708267 }, "harness|gsm8k|5": { "acc": 0.21304018195602728, "acc_stderr": 0.01127844785690078 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
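Beyond the per-task details loaded in the example above, the aggregated numbers shown under "Latest results" can also be pulled directly. The snippet below is only a sketch: it assumes the "results" configuration and its "latest" split, exactly as listed in this card's configuration index, and simply prints the first (most recent) row.

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics for the most recent run.
# "results" and "latest" are the configuration and split names listed in this
# card's configs; treat this as an illustration, not the canonical access path.
results = load_dataset(
    "open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k",
    "results",
    split="latest",
)

# First row of the aggregated results table for this run.
print(results[0])
```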
open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k
[ "region:us" ]
2023-12-04T11:26:41+00:00
{"pretty_name": "Evaluation run of abacusai/Giraffe-beta-13b-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/Giraffe-beta-13b-32k](https://huggingface.co/abacusai/Giraffe-beta-13b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:23:46.573377](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k/blob/main/results_2023-12-04T11-23-46.573377.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5350938644943618,\n \"acc_stderr\": 0.03417616181346088,\n \"acc_norm\": 0.5410188997846813,\n \"acc_norm_stderr\": 0.03491906704101023,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4258338965668367,\n \"mc2_stderr\": 0.014377658114706975\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.014599131353035009,\n \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.01451842182567045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6017725552678749,\n \"acc_stderr\": 0.004885323175701676,\n \"acc_norm\": 0.8042222664807808,\n \"acc_norm_stderr\": 0.003959872578165267\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.03056159042673183,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.03056159042673183\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31216931216931215,\n \"acc_stderr\": 0.0238652068369726,\n \"acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.0238652068369726\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132267,\n \"acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561953,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561953\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016341,\n \"acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016341\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4794871794871795,\n 
\"acc_stderr\": 0.02532966316348994,\n \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.02532966316348994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.708256880733945,\n \"acc_stderr\": 0.019489300968876508,\n \"acc_norm\": 0.708256880733945,\n \"acc_norm_stderr\": 0.019489300968876508\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7049808429118773,\n \"acc_stderr\": 0.01630836377293273,\n \"acc_norm\": 0.7049808429118773,\n \"acc_norm_stderr\": 0.01630836377293273\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.02626167760780665,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.02626167760780665\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.02760468902858199,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 0.02760468902858199\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n \"acc_stderr\": 0.012503310565166244,\n \"acc_norm\": 0.3983050847457627,\n \"acc_norm_stderr\": 0.012503310565166244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935555,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935555\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.4258338965668367,\n \"mc2_stderr\": 0.014377658114706975\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21304018195602728,\n \"acc_stderr\": 0.01127844785690078\n }\n}\n```", "repo_url": "https://huggingface.co/abacusai/Giraffe-beta-13b-32k", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-23-46.573377.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-23-46.573377.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-23-46.573377.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-23-46.573377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-23-46.573377.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-23-46.573377.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["**/details_harness|winogrande|5_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-23-46.573377.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_23_46.573377", "path": ["results_2023-12-04T11-23-46.573377.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T11-23-46.573377.parquet"]}]}]}
2023-12-04T11:27:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of abacusai/Giraffe-beta-13b-32k ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model abacusai/Giraffe-beta-13b-32k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:23:46.573377 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
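The summary above says "To load the details from a run, you can for instance do the following:" but this plain-text rendering drops the accompanying snippet. A minimal sketch is given below; note that the dataset id is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention (this record's repo_url points at abacusai/Giraffe-beta-13b-32k), while `harness_winogrande_5` is one of the configurations listed in this record's metadata.

```python
# Sketch only, not the card's original snippet.
# ASSUMPTION: the dataset id follows the leaderboard's naming convention
# "open-llm-leaderboard/details_<org>__<model>"; adjust it if the real id differs.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_abacusai__Giraffe-beta-13b-32k",  # assumed id
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="latest",          # the "latest" split tracks the most recent run
)
print(data)
```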
[ "# Dataset Card for Evaluation run of abacusai/Giraffe-beta-13b-32k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abacusai/Giraffe-beta-13b-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:23:46.573377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of abacusai/Giraffe-beta-13b-32k", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model abacusai/Giraffe-beta-13b-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:23:46.573377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abacusai/Giraffe-beta-13b-32k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model abacusai/Giraffe-beta-13b-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:23:46.573377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2221cf6f3a0223985903898d798b8c09f429ea49
# Dataset Card for Evaluation run of Weyaxi/Platypus-Nebula-v2-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Weyaxi/Platypus-Nebula-v2-7B](https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:25:54.972492](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B/blob/main/results_2023-12-04T11-25-54.972492.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5564269677143451, "acc_stderr": 0.03374811024697019, "acc_norm": 0.5651516514420288, "acc_norm_stderr": 0.03451915951620442, "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113502, "mc2": 0.4693887506938676, "mc2_stderr": 0.015134250861855079 }, "harness|arc:challenge|25": { "acc": 0.5213310580204779, "acc_stderr": 0.014598087973127108, "acc_norm": 0.5537542662116041, "acc_norm_stderr": 0.014526705548539982 }, "harness|hellaswag|10": { "acc": 0.6377215694084843, "acc_stderr": 0.004796763521045228, "acc_norm": 0.8302131049591714, "acc_norm_stderr": 0.003746781712509652 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5333333333333333, "acc_stderr": 0.043097329010363554, "acc_norm": 0.5333333333333333, "acc_norm_stderr": 0.043097329010363554 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.04026097083296564, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.04026097083296564 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5584905660377358, "acc_stderr": 0.030561590426731833, "acc_norm": 0.5584905660377358, "acc_norm_stderr": 0.030561590426731833 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6319444444444444, "acc_stderr": 0.04032999053960718, "acc_norm": 0.6319444444444444, "acc_norm_stderr": 0.04032999053960718 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5664739884393064, "acc_stderr": 0.03778621079092055, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105654, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.03268335899936337, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.03268335899936337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4413793103448276, "acc_stderr": 0.04137931034482758, "acc_norm": 0.4413793103448276, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137595, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137595 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.043758884927270605 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302844, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302844 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.42857142857142855, "acc_stderr": 0.03481904844438804, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.03481904844438804 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.035679697722680495, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.035679697722680495 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6868686868686869, "acc_stderr": 0.033042050878136525, "acc_norm": 0.6868686868686869, "acc_norm_stderr": 0.033042050878136525 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7202072538860104, "acc_stderr": 0.03239637046735704, "acc_norm": 0.7202072538860104, "acc_norm_stderr": 0.03239637046735704 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5282051282051282, "acc_stderr": 0.025310639254933882, "acc_norm": 0.5282051282051282, "acc_norm_stderr": 0.025310639254933882 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.027080372815145654, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.027080372815145654 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.5378151260504201, "acc_stderr": 0.0323854694875898, "acc_norm": 0.5378151260504201, "acc_norm_stderr": 0.0323854694875898 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.0386155754625517, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.0386155754625517 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.01726674208763079, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.01726674208763079 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.375, "acc_stderr": 0.033016908987210894, "acc_norm": 0.375, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7009803921568627, "acc_stderr": 0.03213325717373617, "acc_norm": 0.7009803921568627, "acc_norm_stderr": 0.03213325717373617 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7383966244725738, "acc_stderr": 0.028609516716994934, "acc_norm": 0.7383966244725738, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6870229007633588, "acc_stderr": 0.04066962905677697, "acc_norm": 0.6870229007633588, "acc_norm_stderr": 0.04066962905677697 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.039849796533028725, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.039849796533028725 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6574074074074074, "acc_stderr": 0.045879047413018105, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.045879047413018105 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6871165644171779, "acc_stderr": 0.036429145782924055, "acc_norm": 0.6871165644171779, "acc_norm_stderr": 0.036429145782924055 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04547960999764376, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04547960999764376 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.025819233256483717, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.025819233256483717 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7790549169859514, "acc_stderr": 0.01483620516733356, "acc_norm": 0.7790549169859514, "acc_norm_stderr": 0.01483620516733356 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5895953757225434, "acc_stderr": 0.02648339204209818, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.02648339204209818 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3206703910614525, "acc_stderr": 0.015609929559348402, "acc_norm": 0.3206703910614525, "acc_norm_stderr": 0.015609929559348402 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6045751633986928, "acc_stderr": 0.027996723180631438, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.027996723180631438 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399662, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399662 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6790123456790124, "acc_stderr": 0.02597656601086274, "acc_norm": 0.6790123456790124, "acc_norm_stderr": 0.02597656601086274 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.42907801418439717, "acc_stderr": 0.02952591430255856, "acc_norm": 0.42907801418439717, "acc_norm_stderr": 0.02952591430255856 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4452411994784876, "acc_stderr": 0.012693421303973294, "acc_norm": 0.4452411994784876, "acc_norm_stderr": 0.012693421303973294 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5919117647058824, "acc_stderr": 0.029855261393483924, "acc_norm": 0.5919117647058824, "acc_norm_stderr": 0.029855261393483924 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5522875816993464, "acc_stderr": 0.020116925347422425, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.020116925347422425 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.046534298079135075, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.046534298079135075 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5020408163265306, "acc_stderr": 0.0320089533497105, "acc_norm": 0.5020408163265306, "acc_norm_stderr": 0.0320089533497105 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.03152439186555404, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.03152439186555404 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.040201512610368445, "acc_norm": 0.8, "acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.03887971849597264, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.032744852119469564, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.032744852119469564 }, "harness|truthfulqa:mc|0": { "mc1": 0.3268053855569155, "mc1_stderr": 0.01641987473113502, "mc2": 0.4693887506938676, "mc2_stderr": 0.015134250861855079 }, "harness|winogrande|5": { "acc": 0.7221783741120757, "acc_stderr": 0.01258891818387159 }, "harness|gsm8k|5": { "acc": 0.10083396512509477, "acc_stderr": 0.008294031192126605 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B
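The card above describes an aggregated "results" configuration and a "latest" split; a minimal sketch of reading them for the dataset id shown on the previous line is given below. The column layout of the results parquet is not spelled out in the card, so treat the prints as inspection steps rather than a guaranteed schema.

```python
# Sketch only: the "results" configuration holds the run's aggregated metrics,
# and the "latest" split points at the most recent evaluation run.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B",
    "results",
    split="latest",
)
print(results)     # inspect the available columns first
print(results[0])  # aggregated metrics for the 2023-12-04 run
```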
[ "region:us" ]
2023-12-04T11:28:45+00:00
{"pretty_name": "Evaluation run of Weyaxi/Platypus-Nebula-v2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Platypus-Nebula-v2-7B](https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:25:54.972492](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B/blob/main/results_2023-12-04T11-25-54.972492.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5564269677143451,\n \"acc_stderr\": 0.03374811024697019,\n \"acc_norm\": 0.5651516514420288,\n \"acc_norm_stderr\": 0.03451915951620442,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113502,\n \"mc2\": 0.4693887506938676,\n \"mc2_stderr\": 0.015134250861855079\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127108,\n \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6377215694084843,\n \"acc_stderr\": 0.004796763521045228,\n \"acc_norm\": 0.8302131049591714,\n \"acc_norm_stderr\": 0.003746781712509652\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n 
\"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936337,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933882,\n \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933882\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145654,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145654\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.0323854694875898,\n \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.0323854694875898\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n \"acc_stderr\": 0.01483620516733356,\n \"acc_norm\": 
0.7790549169859514,\n \"acc_norm_stderr\": 0.01483620516733356\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.015609929559348402,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.015609929559348402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631438,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631438\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.01641987473113502,\n \"mc2\": 0.4693887506938676,\n \"mc2_stderr\": 0.015134250861855079\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.01258891818387159\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \"acc_stderr\": 0.008294031192126605\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["**/details_harness|winogrande|5_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-25-54.972492.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_25_54.972492", "path": ["results_2023-12-04T11-25-54.972492.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T11-25-54.972492.parquet"]}]}]}
2023-12-04T11:29:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Weyaxi/Platypus-Nebula-v2-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Weyaxi/Platypus-Nebula-v2-7B on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:25:54.972492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Weyaxi/Platypus-Nebula-v2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/Platypus-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:25:54.972492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Weyaxi/Platypus-Nebula-v2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/Platypus-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:25:54.972492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/Platypus-Nebula-v2-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/Platypus-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:25:54.972492(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bdf752d4359e0ef7c0f757183334c07e28da9c67
# Create dataset for Arabic named entity
HassenMh/ArabNER
[ "license:cc-by-nc-4.0", "region:us" ]
2023-12-04T11:39:03+00:00
{"license": "cc-by-nc-4.0"}
2024-01-26T09:24:57+00:00
[]
[]
TAGS #license-cc-by-nc-4.0 #region-us
# Create dataset for Arabic named entity
[ "# Create dataset for arabic named entiy" ]
[ "TAGS\n#license-cc-by-nc-4.0 #region-us \n", "# Create dataset for arabic named entiy" ]
[ 17, 11 ]
[ "passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n# Create dataset for arabic named entiy" ]
1f1568cf9320a2c7a1d806ea2ccfaa4036d1fe40
The RTTM files are generated by executing the `libri_css` recipe from `ESPnet` (https://github.com/espnet/espnet/tree/master/egs/libri_css/asr1).
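The RTTM files follow the standard Rich Transcription Time Marked layout, with one `SPEAKER` line per diarized segment. Below is a minimal parsing sketch; it assumes the standard RTTM field order, and the file name in the usage comment is only a hypothetical example rather than a file guaranteed to exist in this repository.

```python
from dataclasses import dataclass
from pathlib import Path


@dataclass
class Segment:
    file_id: str     # recording/session identifier
    onset: float     # segment start time in seconds
    duration: float  # segment length in seconds
    speaker: str     # diarization speaker label


def read_rttm(path):
    """Parse the SPEAKER lines of an RTTM file into Segment objects."""
    segments = []
    for line in Path(path).read_text().splitlines():
        fields = line.split()
        if not fields or fields[0] != "SPEAKER":
            continue
        # Standard RTTM field order: type, file id, channel, onset, duration,
        # orthography, speaker type, speaker name, confidence, lookahead.
        segments.append(
            Segment(
                file_id=fields[1],
                onset=float(fields[3]),
                duration=float(fields[4]),
                speaker=fields[7],
            )
        )
    return segments


# Hypothetical usage (replace with an actual RTTM file from this repository):
# segments = read_rttm("session0.rttm")
# print(segments[:3])
```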
boeddeker/espnet_libri_css_diarize_spectral_rttm
[ "license:mit", "region:us" ]
2023-12-04T11:50:37+00:00
{"license": "mit"}
2023-12-04T11:54:56+00:00
[]
[]
TAGS #license-mit #region-us
The RTTM files are generated by executing the 'libri_css' recipe from 'ESPnet' (URL
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
e12a7202d71d391e2120c367a508665f09174bb8
# Dataset Card for Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:51:15.903375](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T11-51-15.903375.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6355941629435156, "acc_stderr": 0.03230794527671688, "acc_norm": 0.637383178314279, "acc_norm_stderr": 0.03295364147633226, "mc1": 0.3769889840881273, "mc1_stderr": 0.01696551757893035, "mc2": 0.5492935605551788, "mc2_stderr": 0.015342233606445274 }, "harness|arc:challenge|25": { "acc": 0.6245733788395904, "acc_stderr": 0.014150631435111726, "acc_norm": 0.6655290102389079, "acc_norm_stderr": 0.013787460322441374 }, "harness|hellaswag|10": { "acc": 0.6593308105954989, "acc_stderr": 0.004729656826803945, "acc_norm": 0.8490340569607648, "acc_norm_stderr": 0.003572839969521997 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 
0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082635, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082635 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728762, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728762 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42592592592592593, "acc_stderr": 0.02546714904546955, "acc_norm": 0.42592592592592593, "acc_norm_stderr": 0.02546714904546955 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642507, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642507 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919436, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6051282051282051, "acc_stderr": 0.0247843169421564, "acc_norm": 0.6051282051282051, "acc_norm_stderr": 0.0247843169421564 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.6596638655462185, "acc_stderr": 0.03077805742293167, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.03077805742293167 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976037, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976037 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.028626547912437413, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.028626547912437413 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7040358744394619, "acc_stderr": 0.030636591348699796, "acc_norm": 0.7040358744394619, "acc_norm_stderr": 0.030636591348699796 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594654, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594654 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.031570650789119005, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.031570650789119005 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.02250903393707781, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.02250903393707781 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.822477650063857, "acc_stderr": 0.013664230995834834, "acc_norm": 0.822477650063857, "acc_norm_stderr": 0.013664230995834834 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3128491620111732, "acc_stderr": 0.01550689259464726, "acc_norm": 0.3128491620111732, "acc_norm_stderr": 0.01550689259464726 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7516339869281046, "acc_stderr": 0.02473998135511359, "acc_norm": 0.7516339869281046, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 
0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600713, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600713 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653342, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653342 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396546, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396546 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687492, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687492 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142773, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142773 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786855, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786855 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3769889840881273, "mc1_stderr": 0.01696551757893035, "mc2": 0.5492935605551788, "mc2_stderr": 0.015342233606445274 }, "harness|winogrande|5": { "acc": 0.7829518547750592, "acc_stderr": 0.01158587171020941 }, "harness|gsm8k|5": { "acc": 0.6133434420015162, "acc_stderr": 0.013413955095965309 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B
[ "region:us" ]
2023-12-04T11:54:06+00:00
{"pretty_name": "Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:51:15.903375](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B/blob/main/results_2023-12-04T11-51-15.903375.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355941629435156,\n \"acc_stderr\": 0.03230794527671688,\n \"acc_norm\": 0.637383178314279,\n \"acc_norm_stderr\": 0.03295364147633226,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5492935605551788,\n \"mc2_stderr\": 0.015342233606445274\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6593308105954989,\n \"acc_stderr\": 0.004729656826803945,\n \"acc_norm\": 0.8490340569607648,\n \"acc_norm_stderr\": 0.003572839969521997\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437413,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437413\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n \"acc_stderr\": 0.01550689259464726,\n \"acc_norm\": 0.3128491620111732,\n \"acc_norm_stderr\": 0.01550689259464726\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653342,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653342\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396546,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396546\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5492935605551788,\n \"mc2_stderr\": 0.015342233606445274\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \"acc_stderr\": 0.013413955095965309\n 
}\n}\n```", "repo_url": "https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-51-15.903375.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-51-15.903375.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-51-15.903375.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-51-15.903375.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-51-15.903375.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_51_15.903375", "path": ["**/details_harness|winogrande|5_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-51-15.903375.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T11_51_15.903375", "path": ["results_2023-12-04T11-51-15.903375.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T11-51-15.903375.parquet"]}]}]}
2023-12-04T11:54:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model mlabonne/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:51:15.903375 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
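The flattened text above omits the Python snippet that follows "To load the details from a run, you can for instance do the following:"; the snippet preserved in the metadata field of this record is:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (Winogrande, 5-shot).
# Per the card, the "train" split always points to the latest results;
# an explicit "latest" split is also available for each config.
data = load_dataset(
    "open-llm-leaderboard/details_mlabonne__NeuralHermes-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="train",
)
```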
[ "# Dataset Card for Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:51:15.903375(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:51:15.903375(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mlabonne/NeuralHermes-2.5-Mistral-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:51:15.903375(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c40b33e3d87d67c8de45262343d68fcfbe676704
# Dataset Card for Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Tess-XS-v1-3-yarn-128K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [migtissera/Tess-XS-v1-3-yarn-128K](https://huggingface.co/migtissera/Tess-XS-v1-3-yarn-128K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-04T20:29:13.778100](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K/blob/main/results_2023-12-04T20-29-13.778100.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6198767310043558,
        "acc_stderr": 0.03278732745349028,
        "acc_norm": 0.6244134729161351,
        "acc_norm_stderr": 0.03344031108631591,
        "mc1": 0.3488372093023256,
        "mc1_stderr": 0.016684419859986893,
        "mc2": 0.5019574783346394,
        "mc2_stderr": 0.015769313109990427
    },
    "harness|arc:challenge|25": {
        "acc": 0.5861774744027304,
        "acc_stderr": 0.014392730009221007,
        "acc_norm": 0.6160409556313993,
        "acc_norm_stderr": 0.01421244498065189
    },
    "harness|hellaswag|10": {
        "acc": 0.6504680342561243,
        "acc_stderr": 0.004758476684324042,
        "acc_norm": 0.8296156144194383,
        "acc_norm_stderr": 0.003752017639083751
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.29,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.5851851851851851,
        "acc_stderr": 0.04256193767901408,
        "acc_norm": 0.5851851851851851,
        "acc_norm_stderr": 0.04256193767901408
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.6513157894736842,
        "acc_stderr": 0.038781398887976104,
        "acc_norm": 0.6513157894736842,
        "acc_norm_stderr": 0.038781398887976104
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.61,
        "acc_stderr": 0.04902071300001975,
        "acc_norm": 0.61,
        "acc_norm_stderr": 0.04902071300001975
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.660377358490566,
        "acc_stderr": 0.02914690474779833,
        "acc_norm": 0.660377358490566,
        "acc_norm_stderr": 0.02914690474779833
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.7152777777777778,
        "acc_stderr": 0.037738099906869334,
        "acc_norm": 0.7152777777777778,
        "acc_norm_stderr": 0.037738099906869334
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.46,
        "acc_stderr": 0.05009082659620333,
        "acc_norm": 0.46,
        "acc_norm_stderr": 0.05009082659620333
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.48,
        "acc_stderr": 0.050211673156867795,
        "acc_norm": 0.48,
        "acc_norm_stderr": 0.050211673156867795
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.6011560693641619,
        "acc_stderr": 0.037336266553835096,
        "acc_norm": 0.6011560693641619,
        "acc_norm_stderr": 0.037336266553835096
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.3431372549019608,
        "acc_stderr": 0.04724007352383887,
        "acc_norm": 0.3431372549019608,
        "acc_norm_stderr": 0.04724007352383887
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.77,
        "acc_stderr": 0.042295258468165065,
        "acc_norm": 0.77,
        "acc_norm_stderr": 0.042295258468165065
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.574468085106383,
        "acc_stderr": 0.032321469162244695,
        "acc_norm": 0.574468085106383,
        "acc_norm_stderr": 0.032321469162244695
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.4473684210526316,
        "acc_stderr": 0.04677473004491199,
        "acc_norm": 0.4473684210526316,
        "acc_norm_stderr": 0.04677473004491199
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.5172413793103449,
        "acc_stderr": 0.04164188720169375,
        "acc_norm": 0.5172413793103449,
        "acc_norm_stderr": 0.04164188720169375
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.4126984126984127,
        "acc_stderr": 0.025355741263055263,
        "acc_norm": 0.4126984126984127,
        "acc_norm_stderr": 0.025355741263055263
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.373015873015873,
        "acc_stderr": 0.04325506042017086,
        "acc_norm": 0.373015873015873,
        "acc_norm_stderr": 0.04325506042017086
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.7483870967741936,
        "acc_stderr": 0.024685979286239952,
        "acc_norm": 0.7483870967741936,
        "acc_norm_stderr": 0.024685979286239952
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.46798029556650245,
        "acc_stderr": 0.03510766597959217,
        "acc_norm": 0.46798029556650245,
        "acc_norm_stderr": 0.03510766597959217
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.67,
        "acc_stderr": 0.047258156262526066,
        "acc_norm": 0.67,
        "acc_norm_stderr": 0.047258156262526066
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7454545454545455,
        "acc_stderr": 0.03401506715249039,
        "acc_norm": 0.7454545454545455,
        "acc_norm_stderr": 0.03401506715249039
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7575757575757576,
        "acc_stderr": 0.03053289223393202,
        "acc_norm": 0.7575757575757576,
        "acc_norm_stderr": 0.03053289223393202
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8808290155440415,
        "acc_stderr": 0.023381935348121437,
        "acc_norm": 0.8808290155440415,
        "acc_norm_stderr": 0.023381935348121437
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.6076923076923076,
        "acc_stderr": 0.024756000382130956,
        "acc_norm": 0.6076923076923076,
        "acc_norm_stderr": 0.024756000382130956
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.35555555555555557,
        "acc_stderr": 0.029185714949857403,
        "acc_norm": 0.35555555555555557,
        "acc_norm_stderr": 0.029185714949857403
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.634453781512605,
        "acc_stderr": 0.031282177063684614,
        "acc_norm": 0.634453781512605,
        "acc_norm_stderr": 0.031282177063684614
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.33112582781456956,
        "acc_stderr": 0.038425817186598696,
        "acc_norm": 0.33112582781456956,
        "acc_norm_stderr": 0.038425817186598696
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8055045871559633,
        "acc_stderr": 0.01697028909045803,
        "acc_norm": 0.8055045871559633,
        "acc_norm_stderr": 0.01697028909045803
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.4398148148148148,
        "acc_stderr": 0.03385177976044811,
        "acc_norm": 0.4398148148148148,
        "acc_norm_stderr": 0.03385177976044811
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.7941176470588235,
        "acc_stderr": 0.028379449451588667,
        "acc_norm": 0.7941176470588235,
        "acc_norm_stderr": 0.028379449451588667
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7974683544303798,
        "acc_stderr": 0.026160568246601446,
        "acc_norm": 0.7974683544303798,
        "acc_norm_stderr": 0.026160568246601446
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6995515695067265,
        "acc_stderr": 0.03076935200822915,
        "acc_norm": 0.6995515695067265,
        "acc_norm_stderr": 0.03076935200822915
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.7557251908396947,
        "acc_stderr": 0.03768335959728744,
        "acc_norm": 0.7557251908396947,
        "acc_norm_stderr": 0.03768335959728744
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7520661157024794,
        "acc_stderr": 0.03941897526516302,
        "acc_norm": 0.7520661157024794,
        "acc_norm_stderr": 0.03941897526516302
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7685185185185185,
        "acc_stderr": 0.04077494709252627,
        "acc_norm": 0.7685185185185185,
        "acc_norm_stderr": 0.04077494709252627
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.7361963190184049,
        "acc_stderr": 0.03462419931615624,
        "acc_norm": 0.7361963190184049,
        "acc_norm_stderr": 0.03462419931615624
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.49107142857142855,
        "acc_stderr": 0.04745033255489123,
        "acc_norm": 0.49107142857142855,
        "acc_norm_stderr": 0.04745033255489123
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7864077669902912,
        "acc_stderr": 0.040580420156460344,
        "acc_norm": 0.7864077669902912,
        "acc_norm_stderr": 0.040580420156460344
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8247863247863247,
        "acc_stderr": 0.02490443909891823,
        "acc_norm": 0.8247863247863247,
        "acc_norm_stderr": 0.02490443909891823
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.72,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.72,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.80970625798212,
        "acc_stderr": 0.014036945850381394,
        "acc_norm": 0.80970625798212,
        "acc_norm_stderr": 0.014036945850381394
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.6936416184971098,
        "acc_stderr": 0.024818350129436593,
        "acc_norm": 0.6936416184971098,
        "acc_norm_stderr": 0.024818350129436593
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.4201117318435754,
        "acc_stderr": 0.016507671073256402,
        "acc_norm": 0.4201117318435754,
        "acc_norm_stderr": 0.016507671073256402
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.7483660130718954,
        "acc_stderr": 0.024848018263875206,
        "acc_norm": 0.7483660130718954,
        "acc_norm_stderr": 0.024848018263875206
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.6784565916398714,
        "acc_stderr": 0.026527724079528872,
        "acc_norm": 0.6784565916398714,
        "acc_norm_stderr": 0.026527724079528872
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.6851851851851852,
        "acc_stderr": 0.025842248700902168,
        "acc_norm": 0.6851851851851852,
        "acc_norm_stderr": 0.025842248700902168
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.4787234042553192,
        "acc_stderr": 0.029800481645628693,
        "acc_norm": 0.4787234042553192,
        "acc_norm_stderr": 0.029800481645628693
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.439374185136897,
        "acc_stderr": 0.012676014778580214,
        "acc_norm": 0.439374185136897,
        "acc_norm_stderr": 0.012676014778580214
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.6360294117647058,
        "acc_stderr": 0.029227192460032025,
        "acc_norm": 0.6360294117647058,
        "acc_norm_stderr": 0.029227192460032025
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.6421568627450981,
        "acc_stderr": 0.019393058402355442,
        "acc_norm": 0.6421568627450981,
        "acc_norm_stderr": 0.019393058402355442
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6545454545454545,
        "acc_stderr": 0.04554619617541054,
        "acc_norm": 0.6545454545454545,
        "acc_norm_stderr": 0.04554619617541054
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.7142857142857143,
        "acc_stderr": 0.028920583220675596,
        "acc_norm": 0.7142857142857143,
        "acc_norm_stderr": 0.028920583220675596
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.7910447761194029,
        "acc_stderr": 0.028748298931728655,
        "acc_norm": 0.7910447761194029,
        "acc_norm_stderr": 0.028748298931728655
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.84,
        "acc_stderr": 0.03684529491774709,
        "acc_norm": 0.84,
        "acc_norm_stderr": 0.03684529491774709
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.5662650602409639,
        "acc_stderr": 0.03858158940685516,
        "acc_norm": 0.5662650602409639,
        "acc_norm_stderr": 0.03858158940685516
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.7953216374269005,
        "acc_stderr": 0.03094445977853321,
        "acc_norm": 0.7953216374269005,
        "acc_norm_stderr": 0.03094445977853321
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.3488372093023256,
        "mc1_stderr": 0.016684419859986893,
        "mc2": 0.5019574783346394,
        "mc2_stderr": 0.015769313109990427
    },
    "harness|winogrande|5": {
        "acc": 0.7474348855564326,
        "acc_stderr": 0.012211148449394105
    },
    "harness|gsm8k|5": {
        "acc": 0.4336618650492798,
        "acc_stderr": 0.013650728047064692
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
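For readers who want to pull the aggregated numbers above programmatically rather than copy them from the card, here is a minimal sketch. It assumes the `"results"` configuration and the split conventions described in the Dataset Summary; the config and split names are taken from this card, not independently verified against the repository:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K"

# The 63 per-task configurations (e.g. "harness_gsm8k_5") plus the
# aggregated "results" configuration described in the summary above.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Per the card, the "train" split of every configuration points at the
# latest run (here 2023-12-04T20:29:13.778100).
results = load_dataset(repo, "results", split="train")
print(results[0])
```

Older runs remain available as timestamped splits (e.g. `2023_12_04T11_54_49.331822`), as listed in the config metadata below.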
open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K
[ "region:us" ]
2023-12-04T11:57:41+00:00
{"pretty_name": "Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-XS-v1-3-yarn-128K](https://huggingface.co/migtissera/Tess-XS-v1-3-yarn-128K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T20:29:13.778100](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K/blob/main/results_2023-12-04T20-29-13.778100.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6198767310043558,\n \"acc_stderr\": 0.03278732745349028,\n \"acc_norm\": 0.6244134729161351,\n \"acc_norm_stderr\": 0.03344031108631591,\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.5019574783346394,\n \"mc2_stderr\": 0.015769313109990427\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221007,\n \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6504680342561243,\n \"acc_stderr\": 0.004758476684324042,\n \"acc_norm\": 0.8296156144194383,\n \"acc_norm_stderr\": 0.003752017639083751\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.032321469162244695,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.032321469162244695\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239952,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n \"acc_stderr\": 0.02490443909891823,\n \"acc_norm\": 0.8247863247863247,\n \"acc_norm_stderr\": 0.02490443909891823\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.80970625798212,\n \"acc_stderr\": 0.014036945850381394,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381394\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875206,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875206\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580214,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580214\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355442,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355442\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675596,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.5019574783346394,\n \"mc2_stderr\": 0.015769313109990427\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4336618650492798,\n \"acc_stderr\": 
0.013650728047064692\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-XS-v1-3-yarn-128K", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-54-49.331822.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-54-49.331822.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-29-13.778100.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-29-13.778100.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-29-13.778100.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T20-29-13.778100.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-29-13.778100.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": 
"2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-54-49.331822.parquet"]}, 
{"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["**/details_harness|winogrande|5_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": ["**/details_harness|winogrande|5_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T20-29-13.778100.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_54_49.331822", "path": ["results_2023-12-04T11-54-49.331822.parquet"]}, {"split": "2023_12_04T20_29_13.778100", "path": 
["results_2023-12-04T20-29-13.778100.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T20-29-13.778100.parquet"]}]}]}
2023-12-04T20:32:47+00:00
[]
[]
TAGS
#region-us
# Dataset Card for Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model migtissera/Tess-XS-v1-3-yarn-128K on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch below):

## Latest results

These are the latest results from run 2023-12-04T20:29:13.778100 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
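This processed copy of the card omits the code block that follows "To load the details from a run". A minimal sketch of that call, assuming the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other records in this dump, and using the `harness_winogrande_5` configuration listed in this record's metadata:

```python
from datasets import load_dataset

# Assumed repository id (details_<org>__<model> pattern used elsewhere in this dump);
# "harness_winogrande_5" is one of the configurations listed in this record's metadata,
# and "train" points at the latest results per the card text.
data = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-XS-v1-3-yarn-128K",
    "harness_winogrande_5",
    split="train",
)
print(data)
```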
[ "# Dataset Card for Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-XS-v1-3-yarn-128K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T20:29:13.778100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-XS-v1-3-yarn-128K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T20:29:13.778100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 29, 31, 178, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/Tess-XS-v1-3-yarn-128K## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Tess-XS-v1-3-yarn-128K on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T20:29:13.778100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
0a21dad048e1ded37c1d06d6385dbb289c34d39d
# Dataset Card for Evaluation run of Minirecord/Mini_DPO_test02

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Minirecord/Mini_DPO_test02
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Minirecord/Mini_DPO_test02](https://huggingface.co/Minirecord/Mini_DPO_test02) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Minirecord__Mini_DPO_test02",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-12-04T11:56:14.353700](https://huggingface.co/datasets/open-llm-leaderboard/details_Minirecord__Mini_DPO_test02/blob/main/results_2023-12-04T11-56-14.353700.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.6168729607530143, "acc_stderr": 0.03275935687592929, "acc_norm": 0.6223546289993482, "acc_norm_stderr": 0.033429838036267936, "mc1": 0.31946144430844553, "mc1_stderr": 0.016322644182960498, "mc2": 0.48469236116826936, "mc2_stderr": 0.014679613330954367 },
    "harness|arc:challenge|25": { "acc": 0.5699658703071673, "acc_stderr": 0.014467631559137994, "acc_norm": 0.5972696245733788, "acc_norm_stderr": 0.01433223630679014 },
    "harness|hellaswag|10": { "acc": 0.6425014937263493, "acc_stderr": 0.00478283835222253, "acc_norm": 0.8388767177853017, "acc_norm_stderr": 0.0036689326296725643 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.0387813988879761, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.0387813988879761 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.028637235639800886, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.028637235639800886 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.04655010411319616, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.04655010411319616 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482757, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482757 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3835978835978836, "acc_stderr": 0.0250437573185202, "acc_norm": 0.3835978835978836, "acc_norm_stderr": 0.0250437573185202 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7129032258064516, "acc_stderr": 0.025736542745594528, "acc_norm": 0.7129032258064516, "acc_norm_stderr": 0.025736542745594528 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790482, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790482 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.02423353229775873, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.02423353229775873 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6230769230769231, "acc_stderr": 0.024570975364225995, "acc_norm": 0.6230769230769231, "acc_norm_stderr": 0.024570975364225995 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066468, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066468 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6218487394957983, "acc_stderr": 0.03149930577784906, "acc_norm": 
0.6218487394957983, "acc_norm_stderr": 0.03149930577784906 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8055045871559633, "acc_stderr": 0.01697028909045804, "acc_norm": 0.8055045871559633, "acc_norm_stderr": 0.01697028909045804 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.44907407407407407, "acc_stderr": 0.03392238405321617, "acc_norm": 0.44907407407407407, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7892156862745098, "acc_stderr": 0.028626547912437406, "acc_norm": 0.7892156862745098, "acc_norm_stderr": 0.028626547912437406 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.759493670886076, "acc_stderr": 0.02782078198114968, "acc_norm": 0.759493670886076, "acc_norm_stderr": 0.02782078198114968 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.038448761397852714, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.038448761397852714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489122, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489122 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8084291187739464, "acc_stderr": 0.014072859310451949, "acc_norm": 0.8084291187739464, "acc_norm_stderr": 0.014072859310451949 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6763005780346821, "acc_stderr": 0.025190181327608405, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.025190181327608405 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2737430167597765, "acc_stderr": 0.014912413096372434, "acc_norm": 0.2737430167597765, "acc_norm_stderr": 0.014912413096372434 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6895424836601307, "acc_stderr": 0.026493033225145898, "acc_norm": 0.6895424836601307, "acc_norm_stderr": 0.026493033225145898 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6945337620578779, "acc_stderr": 0.026160584450140446, "acc_norm": 0.6945337620578779, "acc_norm_stderr": 0.026160584450140446 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7160493827160493, "acc_stderr": 0.025089478523765127, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 0.025089478523765127 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.02958345203628407, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.02958345203628407 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.44784876140808344, "acc_stderr": 0.012700582404768221, "acc_norm": 0.44784876140808344, "acc_norm_stderr": 0.012700582404768221 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.02916312857067073, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.02916312857067073 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.630718954248366, "acc_stderr": 0.019524316744866353, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.019524316744866353 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454132, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454132 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.02954774168764004, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.02954774168764004 },
    "harness|truthfulqa:mc|0": { "mc1": 0.31946144430844553, "mc1_stderr": 0.016322644182960498, "mc2": 0.48469236116826936, "mc2_stderr": 0.014679613330954367 },
    "harness|winogrande|5": { "acc": 0.7837411207576953, "acc_stderr": 0.01157061486140935 },
    "harness|gsm8k|5": { "acc": 0.3502653525398029, "acc_stderr": 0.013140409455571276 }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
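The card notes that the "results" configuration stores the aggregated metrics of the run and that a "latest" split points at the most recent evaluation. A minimal sketch of reading those aggregates follows; the exact row schema of the loaded dataset is an assumption here, so inspect a row before indexing into it:

```python
from datasets import load_dataset

# "results" and the "latest" split are named in this record's card and metadata;
# the layout of the returned rows is an assumption, so print one row to check it
# before relying on specific field names.
results = load_dataset(
    "open-llm-leaderboard/details_Minirecord__Mini_DPO_test02",
    "results",
    split="latest",
)
print(results[0])
```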
open-llm-leaderboard/details_Minirecord__Mini_DPO_test02
[ "region:us" ]
2023-12-04T11:59:06+00:00
{"pretty_name": "Evaluation run of Minirecord/Mini_DPO_test02", "dataset_summary": "Dataset automatically created during the evaluation run of model [Minirecord/Mini_DPO_test02](https://huggingface.co/Minirecord/Mini_DPO_test02) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Minirecord__Mini_DPO_test02\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:56:14.353700](https://huggingface.co/datasets/open-llm-leaderboard/details_Minirecord__Mini_DPO_test02/blob/main/results_2023-12-04T11-56-14.353700.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6168729607530143,\n \"acc_stderr\": 0.03275935687592929,\n \"acc_norm\": 0.6223546289993482,\n \"acc_norm_stderr\": 0.033429838036267936,\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.48469236116826936,\n \"mc2_stderr\": 0.014679613330954367\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137994,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679014\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6425014937263493,\n \"acc_stderr\": 0.00478283835222253,\n \"acc_norm\": 0.8388767177853017,\n \"acc_norm_stderr\": 0.0036689326296725643\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 
0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n \"acc_stderr\": 
0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045804,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045804\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n 
\"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608405,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608405\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765127,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765127\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.012700582404768221,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.012700582404768221\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866353,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866353\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.48469236116826936,\n \"mc2_stderr\": 0.014679613330954367\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3502653525398029,\n \"acc_stderr\": 0.013140409455571276\n }\n}\n```", "repo_url": "https://huggingface.co/Minirecord/Mini_DPO_test02", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-56-14.353700.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-56-14.353700.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-56-14.353700.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-56-14.353700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-56-14.353700.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-56-14.353700.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["**/details_harness|winogrande|5_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-56-14.353700.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_56_14.353700", "path": ["results_2023-12-04T11-56-14.353700.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T11-56-14.353700.parquet"]}]}]}
2023-12-04T11:59:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Minirecord/Mini_DPO_test02 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Minirecord/Mini_DPO_test02 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:56:14.353700 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
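The flattened card text above ends the sentence "To load the details from a run, you can for instance do the following:" without the snippet it refers to. A minimal sketch of that call follows; the repository id is an assumption here (the text above only shows "URL" placeholders, so the usual open-llm-leaderboard/details_<org>__<model> naming is assumed), while the "harness_winogrande_5" config name is taken from the configuration metadata earlier in this record.

```python
# Minimal sketch: repo id is assumed from the usual details_<org>__<model> naming;
# the config name comes from this record's configuration metadata.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Minirecord__Mini_DPO_test02",  # assumed repo id
    "harness_winogrande_5",   # any config_name listed in the metadata works here
    split="train",            # "train" always points at the latest results
)
print(data)
```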
[ "# Dataset Card for Evaluation run of Minirecord/Mini_DPO_test02", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Minirecord/Mini_DPO_test02 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:56:14.353700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Minirecord/Mini_DPO_test02", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Minirecord/Mini_DPO_test02 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:56:14.353700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Minirecord/Mini_DPO_test02## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Minirecord/Mini_DPO_test02 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:56:14.353700(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cec40f3bb2cf087ad25ca351cd8054bdeefee8e3
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:57:24.228849](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2/blob/main/results_2023-12-04T11-57-24.228849.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5470210161245963, "acc_stderr": 0.033586335697642675, "acc_norm": 0.5564143725807108, "acc_norm_stderr": 0.03444006583011199, "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5299552830341843, "mc2_stderr": 0.01569290592260198 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650649, "acc_norm": 0.6006825938566553, "acc_norm_stderr": 0.014312094557946707 }, "harness|hellaswag|10": { "acc": 0.6502688707428799, "acc_stderr": 0.004759103432380757, "acc_norm": 0.8401712806213901, "acc_norm_stderr": 0.0036569821653861826 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5723684210526315, "acc_stderr": 0.040260970832965634, "acc_norm": 0.5723684210526315, "acc_norm_stderr": 0.040260970832965634 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, 
"acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5375722543352601, "acc_stderr": 0.0380168510452446, "acc_norm": 0.5375722543352601, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46808510638297873, "acc_stderr": 0.03261936918467382, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.35964912280701755, "acc_stderr": 0.045144961328736334, "acc_norm": 0.35964912280701755, "acc_norm_stderr": 0.045144961328736334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30158730158730157, "acc_stderr": 0.023636975996101813, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.023636975996101813 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.04073524322147125, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.04073524322147125 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6419354838709678, "acc_stderr": 0.027273890594300645, "acc_norm": 0.6419354838709678, "acc_norm_stderr": 0.027273890594300645 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4236453201970443, "acc_stderr": 0.034767257476490364, "acc_norm": 0.4236453201970443, "acc_norm_stderr": 0.034767257476490364 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.036462049632538115, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.036462049632538115 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.032424979581788166, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.032424979581788166 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7772020725388601, "acc_stderr": 0.03003114797764154, "acc_norm": 0.7772020725388601, "acc_norm_stderr": 0.03003114797764154 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5051282051282051, "acc_stderr": 0.025349672906838653, "acc_norm": 0.5051282051282051, "acc_norm_stderr": 0.025349672906838653 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.0279404571362284, "acc_norm": 0.3, "acc_norm_stderr": 0.0279404571362284 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5546218487394958, "acc_stderr": 
0.032284106267163895, "acc_norm": 0.5546218487394958, "acc_norm_stderr": 0.032284106267163895 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7302752293577982, "acc_stderr": 0.019028486711115438, "acc_norm": 0.7302752293577982, "acc_norm_stderr": 0.019028486711115438 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.75, "acc_stderr": 0.03039153369274154, "acc_norm": 0.75, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.0283046579430353, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.0283046579430353 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776678, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776678 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.042438692422305246, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.042438692422305246 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.043300437496507416, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.043300437496507416 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.656441717791411, "acc_stderr": 0.037311335196738925, "acc_norm": 0.656441717791411, "acc_norm_stderr": 0.037311335196738925 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8205128205128205, "acc_stderr": 0.02514093595033544, "acc_norm": 0.8205128205128205, "acc_norm_stderr": 0.02514093595033544 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52, "acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7471264367816092, "acc_stderr": 0.015543377313719683, "acc_norm": 0.7471264367816092, "acc_norm_stderr": 0.015543377313719683 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6242774566473989, "acc_stderr": 0.026074314851657083, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.026074314851657083 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34301675977653634, "acc_stderr": 0.015876912673057738, "acc_norm": 0.34301675977653634, "acc_norm_stderr": 0.015876912673057738 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6209150326797386, "acc_stderr": 0.027780141207023344, "acc_norm": 0.6209150326797386, "acc_norm_stderr": 0.027780141207023344 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.617363344051447, "acc_stderr": 0.027604689028581986, "acc_norm": 0.617363344051447, "acc_norm_stderr": 0.027604689028581986 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.6141975308641975, "acc_stderr": 0.027085401226132146, "acc_norm": 0.6141975308641975, "acc_norm_stderr": 0.027085401226132146 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.02927553215970473, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.02927553215970473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42959582790091266, "acc_stderr": 0.012643004623790203, "acc_norm": 0.42959582790091266, "acc_norm_stderr": 0.012643004623790203 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5637254901960784, "acc_stderr": 0.02006287424353913, "acc_norm": 0.5637254901960784, "acc_norm_stderr": 0.02006287424353913 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6326530612244898, "acc_stderr": 0.03086214492108756, "acc_norm": 0.6326530612244898, "acc_norm_stderr": 0.03086214492108756 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7263681592039801, "acc_stderr": 0.03152439186555401, "acc_norm": 0.7263681592039801, "acc_norm_stderr": 0.03152439186555401 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.783625730994152, "acc_stderr": 0.031581495393387324, "acc_norm": 0.783625730994152, "acc_norm_stderr": 0.031581495393387324 }, "harness|truthfulqa:mc|0": { "mc1": 0.3769889840881273, "mc1_stderr": 0.016965517578930354, "mc2": 0.5299552830341843, "mc2_stderr": 0.01569290592260198 }, "harness|winogrande|5": { "acc": 0.7434885556432518, "acc_stderr": 0.012273648008759987 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.003282055917136976 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
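As a hedged complement to the per-task example in the summary above, the aggregated "results" configuration mentioned there can be loaded the same way. The config name "results" comes from the card's own summary, and the "latest" split follows the split-naming convention shown in the configuration metadata of these details repositories; the column layout of the aggregated parquet is not documented in the card, so the sketch below only inspects whatever fields are present rather than assuming specific column names.

```python
# Sketch: load the aggregated "results" config of this details repository
# and inspect the latest run without assuming its exact schema.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2",
    "results",
    split="latest",   # "latest" always points at the most recent run
)

print(results.column_names)  # schema is not documented in the card, so list it
print(results[0])            # first aggregated record of the latest run
```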
open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2
[ "region:us" ]
2023-12-04T12:00:19+00:00
{"pretty_name": "Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2", "dataset_summary": "Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Psyfighter2](https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:57:24.228849](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2/blob/main/results_2023-12-04T11-57-24.228849.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5470210161245963,\n \"acc_stderr\": 0.033586335697642675,\n \"acc_norm\": 0.5564143725807108,\n \"acc_norm_stderr\": 0.03444006583011199,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5299552830341843,\n \"mc2_stderr\": 0.01569290592260198\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946707\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n \"acc_stderr\": 0.004759103432380757,\n \"acc_norm\": 0.8401712806213901,\n \"acc_norm_stderr\": 0.0036569821653861826\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 
0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101813,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101813\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115438,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115438\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n \"acc_stderr\": 0.015543377313719683,\n 
\"acc_norm\": 0.7471264367816092,\n \"acc_norm_stderr\": 0.015543377313719683\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34301675977653634,\n \"acc_stderr\": 0.015876912673057738,\n \"acc_norm\": 0.34301675977653634,\n \"acc_norm_stderr\": 0.015876912673057738\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132146,\n \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132146\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n \"acc_stderr\": 0.012643004623790203,\n \"acc_norm\": 0.42959582790091266,\n \"acc_norm_stderr\": 0.012643004623790203\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5299552830341843,\n \"mc2_stderr\": 0.01569290592260198\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.003282055917136976\n }\n}\n```", "repo_url": "https://huggingface.co/KoboldAI/LLaMA2-13B-Psyfighter2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-57-24.228849.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["**/details_harness|winogrande|5_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-57-24.228849.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_57_24.228849", "path": ["results_2023-12-04T11-57-24.228849.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T11-57-24.228849.parquet"]}]}]}
2023-12-04T12:01:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Psyfighter2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:57:24.228849 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
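The flattened card above refers to a loading snippet that was stripped out during processing; a hedged reconstruction, mirroring the snippet shown in the neighbouring Medilora card and assuming the same `details_<org>__<model>` dataset id, would be:

```python
from datasets import load_dataset

# Assumed dataset id for this run, following the details_<org>__<model> naming
# pattern used by the neighbouring records in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Psyfighter2",
    "harness_winogrande_5",
    split="train",
)
```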
[ "# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Psyfighter2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:57:24.228849(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Psyfighter2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:57:24.228849(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Psyfighter2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KoboldAI/LLaMA2-13B-Psyfighter2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:57:24.228849(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8414e1001b12ee9d2e462738b146849cb4d71afe
# Dataset Card for Evaluation run of Medilora/medilora-qwen-14b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Medilora/medilora-qwen-14b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Medilora/medilora-qwen-14b](https://huggingface.co/Medilora/medilora-qwen-14b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Medilora__medilora-qwen-14b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T11:58:35.188155](https://huggingface.co/datasets/open-llm-leaderboard/details_Medilora__medilora-qwen-14b/blob/main/results_2023-12-04T11-58-35.188155.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6559355919072443, "acc_stderr": 0.03225211233928835, "acc_norm": 0.65931119421255, "acc_norm_stderr": 0.03289326762517802, "mc1": 0.3243574051407589, "mc1_stderr": 0.01638797677964794, "mc2": 0.4774778409132138, "mc2_stderr": 0.015634826376144897 }, "harness|arc:challenge|25": { "acc": 0.5307167235494881, "acc_stderr": 0.014583792546304037, "acc_norm": 0.5665529010238908, "acc_norm_stderr": 0.0144813762245589 }, "harness|hellaswag|10": { "acc": 0.6042620991834295, "acc_stderr": 0.004880092083408044, "acc_norm": 0.7907787293367855, "acc_norm_stderr": 0.004059213774735545 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.03842498559395268, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.03842498559395268 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7916666666666666, "acc_stderr": 0.03396116205845334, "acc_norm": 0.7916666666666666, "acc_norm_stderr": 0.03396116205845334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6936416184971098, "acc_stderr": 0.035149425512674394, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.035149425512674394 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.040824829046386284, "acc_norm": 0.6, "acc_norm_stderr": 0.040824829046386284 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5291005291005291, "acc_stderr": 0.025707658614154947, "acc_norm": 0.5291005291005291, "acc_norm_stderr": 0.025707658614154947 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7677419354838709, "acc_stderr": 0.024022256130308235, "acc_norm": 0.7677419354838709, "acc_norm_stderr": 0.024022256130308235 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5911330049261084, "acc_stderr": 0.03459058815883232, "acc_norm": 0.5911330049261084, "acc_norm_stderr": 0.03459058815883232 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8484848484848485, "acc_stderr": 0.025545650426603617, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.025545650426603617 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.02423353229775873, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.02423353229775873 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.024035489676335065, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.024035489676335065 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3851851851851852, "acc_stderr": 0.029670906124630886, "acc_norm": 0.3851851851851852, "acc_norm_stderr": 0.029670906124630886 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7563025210084033, "acc_stderr": 0.027886828078380558, "acc_norm": 0.7563025210084033, "acc_norm_stderr": 0.027886828078380558 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.015919557829976044, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.015919557829976044 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6064814814814815, "acc_stderr": 0.03331747876370312, "acc_norm": 0.6064814814814815, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601446, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601446 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7130044843049327, "acc_stderr": 0.03036037971029195, "acc_norm": 0.7130044843049327, "acc_norm_stderr": 0.03036037971029195 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728742, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728742 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128137, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.01381633538997313, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.01381633538997313 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39106145251396646, "acc_stderr": 0.01632076376380838, "acc_norm": 0.39106145251396646, "acc_norm_stderr": 0.01632076376380838 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6851851851851852, "acc_stderr": 0.02584224870090217, "acc_norm": 
0.6851851851851852, "acc_norm_stderr": 0.02584224870090217 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5104302477183833, "acc_stderr": 0.012767457253930657, "acc_norm": 0.5104302477183833, "acc_norm_stderr": 0.012767457253930657 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.028064998167040094, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.028064998167040094 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6584967320261438, "acc_stderr": 0.019184639328092487, "acc_norm": 0.6584967320261438, "acc_norm_stderr": 0.019184639328092487 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7591836734693878, "acc_stderr": 0.027372942201788163, "acc_norm": 0.7591836734693878, "acc_norm_stderr": 0.027372942201788163 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700637, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700637 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.032659863237109066, "acc_norm": 0.88, "acc_norm_stderr": 0.032659863237109066 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.3243574051407589, "mc1_stderr": 0.01638797677964794, "mc2": 0.4774778409132138, "mc2_stderr": 0.015634826376144897 }, "harness|winogrande|5": { "acc": 0.7490134175217048, "acc_stderr": 0.012185776220516137 }, "harness|gsm8k|5": { "acc": 0.5860500379075056, "acc_stderr": 0.013566991960151781 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Medilora__medilora-qwen-14b
[ "region:us" ]
2023-12-04T12:01:12+00:00
{"pretty_name": "Evaluation run of Medilora/medilora-qwen-14b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Medilora/medilora-qwen-14b](https://huggingface.co/Medilora/medilora-qwen-14b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Medilora__medilora-qwen-14b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T11:58:35.188155](https://huggingface.co/datasets/open-llm-leaderboard/details_Medilora__medilora-qwen-14b/blob/main/results_2023-12-04T11-58-35.188155.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559355919072443,\n \"acc_stderr\": 0.03225211233928835,\n \"acc_norm\": 0.65931119421255,\n \"acc_norm_stderr\": 0.03289326762517802,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4774778409132138,\n \"mc2_stderr\": 0.015634826376144897\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304037,\n \"acc_norm\": 0.5665529010238908,\n \"acc_norm_stderr\": 0.0144813762245589\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6042620991834295,\n \"acc_stderr\": 0.004880092083408044,\n \"acc_norm\": 0.7907787293367855,\n \"acc_norm_stderr\": 0.004059213774735545\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.03396116205845334,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.03396116205845334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5291005291005291,\n \"acc_stderr\": 0.025707658614154947,\n \"acc_norm\": 0.5291005291005291,\n \"acc_norm_stderr\": 0.025707658614154947\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5911330049261084,\n \"acc_stderr\": 0.03459058815883232,\n \"acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.03459058815883232\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603617,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603617\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335065,\n \"acc_norm\": 0.658974358974359,\n 
\"acc_norm_stderr\": 0.024035489676335065\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630886,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630886\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.027886828078380558,\n \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.027886828078380558\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.01381633538997313,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.01381633538997313\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n \"acc_stderr\": 0.01632076376380838,\n \"acc_norm\": 0.39106145251396646,\n \"acc_norm_stderr\": 0.01632076376380838\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5104302477183833,\n \"acc_stderr\": 0.012767457253930657,\n \"acc_norm\": 0.5104302477183833,\n \"acc_norm_stderr\": 0.012767457253930657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4774778409132138,\n \"mc2_stderr\": 0.015634826376144897\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516137\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5860500379075056,\n \"acc_stderr\": 0.013566991960151781\n }\n}\n```", "repo_url": "https://huggingface.co/Medilora/medilora-qwen-14b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-58-35.188155.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-58-35.188155.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-58-35.188155.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T11-58-35.188155.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-58-35.188155.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-58-35.188155.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["**/details_harness|winogrande|5_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T11-58-35.188155.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T11_58_35.188155", "path": ["results_2023-12-04T11-58-35.188155.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T11-58-35.188155.parquet"]}]}]}
2023-12-04T12:01:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Medilora/medilora-qwen-14b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Medilora/medilora-qwen-14b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T11:58:35.188155 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Medilora/medilora-qwen-14b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-qwen-14b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:58:35.188155(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Medilora/medilora-qwen-14b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-qwen-14b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T11:58:35.188155(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Medilora/medilora-qwen-14b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-qwen-14b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T11:58:35.188155(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5af46da0749f813ca943c0b559ed20ff76b29988
# Dataset Card for Evaluation run of ceadar-ie/FinanceConnect-13B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/ceadar-ie/FinanceConnect-13B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [ceadar-ie/FinanceConnect-13B](https://huggingface.co/ceadar-ie/FinanceConnect-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B",
	"harness_truthfulqa_mc_0",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-10T15:47:22.242382](https://huggingface.co/datasets/open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B/blob/main/results_2023-12-10T15-47-22.242382.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "mc1": 0.2484700122399021,
        "mc1_stderr": 0.015127427096520672,
        "mc2": 0.37682302005478885,
        "mc2_stderr": 0.015200964572751172
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2484700122399021,
        "mc1_stderr": 0.015127427096520672,
        "mc2": 0.37682302005478885,
        "mc2_stderr": 0.015200964572751172
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B
[ "region:us" ]
2023-12-04T12:05:03+00:00
{"pretty_name": "Evaluation run of ceadar-ie/FinanceConnect-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ceadar-ie/FinanceConnect-13B](https://huggingface.co/ceadar-ie/FinanceConnect-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-10T15:47:22.242382](https://huggingface.co/datasets/open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B/blob/main/results_2023-12-10T15-47-22.242382.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.37682302005478885,\n \"mc2_stderr\": 0.015200964572751172\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.37682302005478885,\n \"mc2_stderr\": 0.015200964572751172\n }\n}\n```", "repo_url": "https://huggingface.co/ceadar-ie/FinanceConnect-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|arc:challenge|25_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|gsm8k|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hellaswag|10_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-02-08.348872.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T12-02-08.348872.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T12-02-08.348872.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-38-39.240881.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-38-39.240881.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-38-39.240881.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-08T02-38-39.240881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": 
["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-08T02-38-39.240881.parquet"]}, {"split": "2023_12_10T14_56_57.370238", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T14-56-57.370238.parquet"]}, {"split": "2023_12_10T15_47_22.242382", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T15-47-22.242382.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-10T15-47-22.242382.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["**/details_harness|winogrande|5_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["**/details_harness|winogrande|5_2023-12-08T02-38-39.240881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-08T02-38-39.240881.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T12_02_08.348872", "path": ["results_2023-12-04T12-02-08.348872.parquet"]}, {"split": "2023_12_08T02_38_39.240881", "path": ["results_2023-12-08T02-38-39.240881.parquet"]}, {"split": "2023_12_10T14_56_57.370238", "path": ["results_2023-12-10T14-56-57.370238.parquet"]}, {"split": "2023_12_10T15_47_22.242382", "path": ["results_2023-12-10T15-47-22.242382.parquet"]}, {"split": "latest", "path": ["results_2023-12-10T15-47-22.242382.parquet"]}]}]}
2023-12-10T15:47:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ceadar-ie/FinanceConnect-13B

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model ceadar-ie/FinanceConnect-13B on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance use the snippet reproduced after this card.

## Latest results

These are the latest results from run 2023-12-10T15:47:22.242382 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval). The recorded figures are reproduced after this card.

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
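The loading snippet and the latest aggregated figures were stripped from the flattened card above; both are recorded in this entry's metadata block and are reproduced here. Running the snippet assumes the Hugging Face `datasets` library and network access to the Hub.

```python
from datasets import load_dataset

# Config and split names as recorded in the card metadata; the configs section
# also lists per-timestamp splits and a "latest" split for each config.
data = load_dataset(
    "open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B",
    "harness_truthfulqa_mc_0",
    split="train",
)
```

The metadata also records the aggregated figures for the latest run (2023-12-10T15:47:22.242382), which covered only the TruthfulQA task:

```python
{
    "all": {
        "mc1": 0.2484700122399021,
        "mc1_stderr": 0.015127427096520672,
        "mc2": 0.37682302005478885,
        "mc2_stderr": 0.015200964572751172
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2484700122399021,
        "mc1_stderr": 0.015127427096520672,
        "mc2": 0.37682302005478885,
        "mc2_stderr": 0.015200964572751172
    }
}
```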
[ "# Dataset Card for Evaluation run of ceadar-ie/FinanceConnect-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ceadar-ie/FinanceConnect-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-10T15:47:22.242382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ceadar-ie/FinanceConnect-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model ceadar-ie/FinanceConnect-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-10T15:47:22.242382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 19, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ceadar-ie/FinanceConnect-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ceadar-ie/FinanceConnect-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-10T15:47:22.242382(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
df295b7c77bf9a31f8b6e37d1c999f9891d24589
# Dataset Card for Evaluation run of Medilora/medilora-mistral-7b

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Medilora/medilora-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Medilora/medilora-mistral-7b](https://huggingface.co/Medilora/medilora-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Medilora__medilora-mistral-7b",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-04T12:09:16.453203](https://huggingface.co/datasets/open-llm-leaderboard/details_Medilora__medilora-mistral-7b/blob/main/results_2023-12-04T12-09-16.453203.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6225779223336574, "acc_stderr": 0.03262923869284095, "acc_norm": 0.6256207177826597, "acc_norm_stderr": 0.03328227496804697, "mc1": 0.3329253365973072, "mc1_stderr": 0.016497402382012055, "mc2": 0.49909817615808516, "mc2_stderr": 0.014849029663559004 }, "harness|arc:challenge|25": { "acc": 0.5861774744027304, "acc_stderr": 0.014392730009221007, "acc_norm": 0.6168941979522184, "acc_norm_stderr": 0.014206472661672874 }, "harness|hellaswag|10": { "acc": 0.6324437363075085, "acc_stderr": 0.004811543077792712, "acc_norm": 0.8313085042820155, "acc_norm_stderr": 0.003737138752336941 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847415, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847415 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5481481481481482, "acc_stderr": 0.042992689054808644, "acc_norm": 0.5481481481481482, "acc_norm_stderr": 0.042992689054808644 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926604, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.660377358490566, "acc_stderr": 0.029146904747798335, "acc_norm": 0.660377358490566, "acc_norm_stderr": 0.029146904747798335 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887248, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.032671518489247764, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.032671518489247764 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5350877192982456, "acc_stderr": 0.046920083813689104, "acc_norm": 0.5350877192982456, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.0416656757710158, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.0416656757710158 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.02533120243894443, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.02533120243894443 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04426266681379909, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04426266681379909 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7387096774193549, "acc_stderr": 0.024993053397764815, "acc_norm": 0.7387096774193549, "acc_norm_stderr": 0.024993053397764815 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5221674876847291, "acc_stderr": 0.03514528562175008, "acc_norm": 0.5221674876847291, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.0315841532404771, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.0315841532404771 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217483, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217483 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121437, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121437 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6333333333333333, "acc_stderr": 0.024433016466052466, "acc_norm": 0.6333333333333333, "acc_norm_stderr": 0.024433016466052466 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815632, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815632 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 
0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8293577981651377, "acc_stderr": 0.016129271025099864, "acc_norm": 0.8293577981651377, "acc_norm_stderr": 0.016129271025099864 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640766, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640766 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676166, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676166 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.03138147637575499, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.03138147637575499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596915, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596915 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.03957835471980981, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.03957835471980981 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489124, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489124 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597524, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597524 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8058748403575989, "acc_stderr": 0.014143970276657574, "acc_norm": 0.8058748403575989, "acc_norm_stderr": 0.014143970276657574 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6878612716763006, "acc_stderr": 0.024946792225272314, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.024946792225272314 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33743016759776534, "acc_stderr": 0.015813901283913048, "acc_norm": 0.33743016759776534, "acc_norm_stderr": 0.015813901283913048 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.662379421221865, "acc_stderr": 0.026858825879488544, "acc_norm": 0.662379421221865, "acc_norm_stderr": 0.026858825879488544 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.691358024691358, "acc_stderr": 0.02570264026060374, "acc_norm": 0.691358024691358, "acc_norm_stderr": 0.02570264026060374 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5106382978723404, "acc_stderr": 0.02982074719142244, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.02982074719142244 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45632333767926986, "acc_stderr": 0.012721420501462546, "acc_norm": 0.45632333767926986, "acc_norm_stderr": 0.012721420501462546 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6421568627450981, "acc_stderr": 0.019393058402355442, "acc_norm": 0.6421568627450981, "acc_norm_stderr": 0.019393058402355442 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233264, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233264 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3329253365973072, "mc1_stderr": 0.016497402382012055, "mc2": 0.49909817615808516, "mc2_stderr": 0.014849029663559004 }, "harness|winogrande|5": { "acc": 0.77663772691397, "acc_stderr": 0.011705697565205201 }, "harness|gsm8k|5": { "acc": 0.5185746777862017, "acc_stderr": 0.013762977910317584 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
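For the aggregated metrics listed under "Latest results" above, the same `load_dataset` call can target the "results" configuration, whose "latest" split points at the most recent run according to the configs declared for this repository. The snippet below is a minimal sketch under that assumption; it does not presume a particular column layout for the results parquet and simply inspects whatever is returned:

```python
from datasets import load_dataset

# Aggregated results for this evaluation run; the "latest" split mirrors
# the most recent timestamped split (here 2023-12-04T12:09:16.453203).
results = load_dataset(
    "open-llm-leaderboard/details_Medilora__medilora-mistral-7b",
    "results",
    split="latest",
)

# The exact schema of the results parquet is not documented in this card,
# so inspect the columns and the first row rather than hard-coding field names.
print(results.column_names)
print(results[0])
```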
open-llm-leaderboard/details_Medilora__medilora-mistral-7b
[ "region:us" ]
2023-12-04T12:12:06+00:00
{"pretty_name": "Evaluation run of Medilora/medilora-mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Medilora/medilora-mistral-7b](https://huggingface.co/Medilora/medilora-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Medilora__medilora-mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T12:09:16.453203](https://huggingface.co/datasets/open-llm-leaderboard/details_Medilora__medilora-mistral-7b/blob/main/results_2023-12-04T12-09-16.453203.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6225779223336574,\n \"acc_stderr\": 0.03262923869284095,\n \"acc_norm\": 0.6256207177826597,\n \"acc_norm_stderr\": 0.03328227496804697,\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49909817615808516,\n \"mc2_stderr\": 0.014849029663559004\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221007,\n \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672874\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6324437363075085,\n \"acc_stderr\": 0.004811543077792712,\n \"acc_norm\": 0.8313085042820155,\n \"acc_norm_stderr\": 0.003737138752336941\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 
0.024433016466052466,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052466\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099864,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099864\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489124,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489124\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.014143970276657574,\n \"acc_norm\": 0.8058748403575989,\n 
\"acc_norm_stderr\": 0.014143970276657574\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.02570264026060374,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.02570264026060374\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45632333767926986,\n \"acc_stderr\": 0.012721420501462546,\n \"acc_norm\": 0.45632333767926986,\n \"acc_norm_stderr\": 0.012721420501462546\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355442,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355442\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n \"mc1_stderr\": 0.016497402382012055,\n \"mc2\": 0.49909817615808516,\n \"mc2_stderr\": 0.014849029663559004\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5185746777862017,\n \"acc_stderr\": 0.013762977910317584\n }\n}\n```", "repo_url": "https://huggingface.co/Medilora/medilora-mistral-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|arc:challenge|25_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|gsm8k|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hellaswag|10_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-09-16.453203.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-09-16.453203.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-09-16.453203.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T12-09-16.453203.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-09-16.453203.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-09-16.453203.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["**/details_harness|winogrande|5_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T12-09-16.453203.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T12_09_16.453203", "path": ["results_2023-12-04T12-09-16.453203.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T12-09-16.453203.parquet"]}]}]}
2023-12-04T12:12:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Medilora/medilora-mistral-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Medilora/medilora-mistral-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T12:09:16.453203 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Medilora/medilora-mistral-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T12:09:16.453203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Medilora/medilora-mistral-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T12:09:16.453203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Medilora/medilora-mistral-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Medilora/medilora-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T12:09:16.453203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
8c82f57341f2314a58faf7b0f8ebbbb78164856c
# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/orca_mini_v3_13B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T12:38:59.699618](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ/blob/main/results_2023-12-04T12-38-59.699618.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5613401785987572, "acc_stderr": 0.033576900106646816, "acc_norm": 0.5663280514839403, "acc_norm_stderr": 0.0342786577705747, "mc1": 0.3561811505507956, "mc1_stderr": 0.016763790728446335, "mc2": 0.4922092515317753, "mc2_stderr": 0.015510989644544924 }, "harness|arc:challenge|25": { "acc": 0.5930034129692833, "acc_stderr": 0.01435639941800912, "acc_norm": 0.6194539249146758, "acc_norm_stderr": 0.014188277712349814 }, "harness|hellaswag|10": { "acc": 0.617805218084047, "acc_stderr": 0.004849306998727771, "acc_norm": 0.81557458673571, "acc_norm_stderr": 0.003870381199967957 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5, "acc_stderr": 0.04068942293855797, "acc_norm": 0.5, "acc_norm_stderr": 0.04068942293855797 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6188679245283019, "acc_stderr": 0.029890609686286637, "acc_norm": 0.6188679245283019, "acc_norm_stderr": 0.029890609686286637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5902777777777778, "acc_stderr": 0.041124909746707884, "acc_norm": 0.5902777777777778, "acc_norm_stderr": 0.041124909746707884 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.03811890988940412, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.29411764705882354, "acc_stderr": 0.04533838195929776, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.04533838195929776 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4808510638297872, "acc_stderr": 0.03266204299064678, "acc_norm": 0.4808510638297872, "acc_norm_stderr": 0.03266204299064678 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159394, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159394 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728763, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728763 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.30952380952380953, "acc_stderr": 0.023809523809523846, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.023809523809523846 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04216370213557836, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04216370213557836 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6387096774193548, "acc_stderr": 0.02732754844795754, "acc_norm": 0.6387096774193548, "acc_norm_stderr": 0.02732754844795754 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43842364532019706, "acc_stderr": 0.03491207857486519, "acc_norm": 0.43842364532019706, "acc_norm_stderr": 0.03491207857486519 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7373737373737373, "acc_stderr": 0.03135305009533086, "acc_norm": 0.7373737373737373, "acc_norm_stderr": 0.03135305009533086 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8082901554404145, "acc_stderr": 0.02840895362624527, "acc_norm": 0.8082901554404145, "acc_norm_stderr": 0.02840895362624527 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.541025641025641, "acc_stderr": 0.025265525491284295, "acc_norm": 0.541025641025641, "acc_norm_stderr": 0.025265525491284295 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.02831753349606647, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.02831753349606647 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5840336134453782, "acc_stderr": 0.032016501007396114, "acc_norm": 
0.5840336134453782, "acc_norm_stderr": 0.032016501007396114 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389024, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389024 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7541284403669725, "acc_stderr": 0.018461940968708436, "acc_norm": 0.7541284403669725, "acc_norm_stderr": 0.018461940968708436 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977749, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7352941176470589, "acc_stderr": 0.030964517926923403, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.030964517926923403 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.027652153144159256, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.027652153144159256 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6278026905829597, "acc_stderr": 0.03244305283008731, "acc_norm": 0.6278026905829597, "acc_norm_stderr": 0.03244305283008731 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6748466257668712, "acc_stderr": 0.036803503712864616, "acc_norm": 0.6748466257668712, "acc_norm_stderr": 0.036803503712864616 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.04521829902833585, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.04521829902833585 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0272360139461967, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0272360139461967 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.756066411238825, "acc_stderr": 0.015357212665829461, "acc_norm": 0.756066411238825, "acc_norm_stderr": 0.015357212665829461 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6473988439306358, "acc_stderr": 0.025722802200895803, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.025722802200895803 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3787709497206704, "acc_stderr": 0.016223533510365113, "acc_norm": 0.3787709497206704, "acc_norm_stderr": 0.016223533510365113 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6078431372549019, "acc_stderr": 0.027956046165424516, "acc_norm": 0.6078431372549019, "acc_norm_stderr": 0.027956046165424516 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6655948553054662, "acc_stderr": 0.026795422327893934, "acc_norm": 0.6655948553054662, "acc_norm_stderr": 0.026795422327893934 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6327160493827161, "acc_stderr": 
0.0268228017595079, "acc_norm": 0.6327160493827161, "acc_norm_stderr": 0.0268228017595079 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.40425531914893614, "acc_stderr": 0.029275532159704725, "acc_norm": 0.40425531914893614, "acc_norm_stderr": 0.029275532159704725 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4198174706649283, "acc_stderr": 0.012604960816087377, "acc_norm": 0.4198174706649283, "acc_norm_stderr": 0.012604960816087377 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5147058823529411, "acc_stderr": 0.03035969707904612, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5669934640522876, "acc_stderr": 0.020045442473324227, "acc_norm": 0.5669934640522876, "acc_norm_stderr": 0.020045442473324227 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726496, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726496 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6616915422885572, "acc_stderr": 0.033455630703391935, "acc_norm": 0.6616915422885572, "acc_norm_stderr": 0.033455630703391935 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.038786267710023595, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.03218093795602357, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.3561811505507956, "mc1_stderr": 0.016763790728446335, "mc2": 0.4922092515317753, "mc2_stderr": 0.015510989644544924 }, "harness|winogrande|5": { "acc": 0.7576953433307024, "acc_stderr": 0.012042352526174789 }, "harness|gsm8k|5": { "acc": 0.29492039423805916, "acc_stderr": 0.01256069801095475 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
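As a small complement to the loading snippet above: the aggregated metrics shown under "Latest results" also live in the separate "results" configuration, whose "latest" split always points to the most recent run. The sketch below only assumes the config and split names documented in this card; the exact column layout of the aggregated table is not documented here and should be inspected rather than relied upon.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
# The "latest" split tracks the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ",
    "results",
    split="latest",
)

# Inspect the stored schema before assuming specific metric columns.
print(results)
print(results.column_names)
```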
open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ
[ "region:us" ]
2023-12-04T12:42:00+00:00
{"pretty_name": "Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/orca_mini_v3_13B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T12:38:59.699618](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ/blob/main/results_2023-12-04T12-38-59.699618.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5613401785987572,\n \"acc_stderr\": 0.033576900106646816,\n \"acc_norm\": 0.5663280514839403,\n \"acc_norm_stderr\": 0.0342786577705747,\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4922092515317753,\n \"mc2_stderr\": 0.015510989644544924\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617805218084047,\n \"acc_stderr\": 0.004849306998727771,\n \"acc_norm\": 0.81557458673571,\n \"acc_norm_stderr\": 0.003870381199967957\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.041124909746707884,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.041124909746707884\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n 
\"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523846,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523846\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.541025641025641,\n 
\"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708436,\n \"acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708436\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0272360139461967,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0272360139461967\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n \"acc_stderr\": 0.015357212665829461,\n \"acc_norm\": 0.756066411238825,\n \"acc_norm_stderr\": 
0.015357212665829461\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n \"acc_stderr\": 0.016223533510365113,\n \"acc_norm\": 0.3787709497206704,\n \"acc_norm_stderr\": 0.016223533510365113\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.0268228017595079,\n \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.0268228017595079\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n \"acc_stderr\": 0.012604960816087377,\n \"acc_norm\": 0.4198174706649283,\n \"acc_norm_stderr\": 0.012604960816087377\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n \"acc_stderr\": 0.033455630703391935,\n \"acc_norm\": 0.6616915422885572,\n \"acc_norm_stderr\": 0.033455630703391935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4922092515317753,\n \"mc2_stderr\": 0.015510989644544924\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29492039423805916,\n \"acc_stderr\": 0.01256069801095475\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|arc:challenge|25_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|gsm8k|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hellaswag|10_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["**/details_harness|winogrande|5_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T12-38-59.699618.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T12_38_59.699618", "path": ["results_2023-12-04T12-38-59.699618.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T12-38-59.699618.parquet"]}]}]}
2023-12-04T12:42:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model TheBloke/orca_mini_v3_13B-GPTQ on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T12:38:59.699618 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
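The card above says the per-run details can be loaded; a minimal sketch follows (the repository and config names are assumptions based on the usual Open LLM Leaderboard naming convention, not taken from this card):

```python
from datasets import load_dataset

# Assumed repo/config names following the Open LLM Leaderboard convention;
# substitute the actual details repository for TheBloke/orca_mini_v3_13B-GPTQ.
details = load_dataset(
    "open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ",
    "harness_winogrande_5",
    split="latest",
)
print(details[0])
```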
[ "# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/orca_mini_v3_13B-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T12:38:59.699618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/orca_mini_v3_13B-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T12:38:59.699618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/orca_mini_v3_13B-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T12:38:59.699618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
60810824c39d1c7d97ce2454e14abb9e44a5fd57
# Dataset Card for Corporate Phishing <!-- Provide a quick summary of the dataset. --> Bilic Phishing Corpus contains over 20k chat dialogues with annotated summaries. There are two features: - dialogue: text of the dialogue. - summary: a summary of the dialogue used to determine a phishing attempt. ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Dialogue:** Intra-corporate conversation - **Summary:** A summary of the dialogue used to determine a phishing attempt ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** https://huggingface.co/datasets/samsum ## Uses Build models for anti-phishing detection in a conversation or dialogue ### Direct Use For fraud prevention, security and classification models [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Creation ### Curation Rationale Below is an example of the dialogue and summary synthetic data we used to train a Mistral 7B model for real-time fraud detection. We were faced with data constraints for real-time dialogue, so synthetic data became our ally in enhancing model performance. [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
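As a rough usage sketch for the card above — the `dialogue` and `summary` column names follow the feature description, while the `train` split name is an assumption:

```python
from datasets import load_dataset

# Column names come from the card's feature description; the "train" split is assumed.
ds = load_dataset("Bilic/phishing", split="train")
sample = ds[0]
print(sample["dialogue"][:200])  # the conversation text
print(sample["summary"])         # summary used to judge whether it is a phishing attempt
```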
Bilic/phishing
[ "task_categories:text-classification", "task_categories:summarization", "task_categories:question-answering", "task_categories:conversational", "task_categories:text-to-speech", "size_categories:10K<n<100K", "license:mit", "phishing", "fraud", "region:us" ]
2023-12-04T12:52:14+00:00
{"license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification", "summarization", "question-answering", "conversational", "text-to-speech"], "pretty_name": "bilic-phishing", "tags": ["phishing", "fraud"]}
2023-12-04T13:13:31+00:00
[]
[]
TAGS #task_categories-text-classification #task_categories-summarization #task_categories-question-answering #task_categories-conversational #task_categories-text-to-speech #size_categories-10K<n<100K #license-mit #phishing #fraud #region-us
# Dataset Card for Cooperate Phishing Bilic Phishing Corpus contains over 20k chat dialogues with annotated summaries. There are two features: - dialogue: text of dialogue. - summary: A summary of the dialogure to determine phishing attempt. " ## Dataset Details ### Dataset Description - Dialogue: Intra Cooperate Conversation - Summary: A summary of the dialogure to determine phishing attempt ### Dataset Sources [optional] - Repository: URL ## Uses Build models to Anti- Phishing detection in a conversation or dialogue ### Direct Use For fraud prevention, security and clasfficiation models ### Out-of-Scope Use ## Dataset Creation ### Curation Rationale Below is an example of the dialogue and summary synthetic data we used to train a Mistral 7b model for real time fraud detection. We were faced with data constraints for real time dailogure, synthetic data becomes our ally in enhancing model performance. ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Cooperate Phishing\n\n\n\nBilic Phishing Corpus contains over 20k chat dialogues with annotated summaries. There are two features: - dialogue: text of dialogue. - summary: A summary of the dialogure to determine phishing attempt. \"", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Dialogue: Intra Cooperate Conversation \n- Summary: A summary of the dialogure to determine phishing attempt", "### Dataset Sources [optional]\n\n\n\n- Repository: URL", "## Uses\n\nBuild models to Anti- Phishing detection in a conversation or dialogue", "### Direct Use\n\nFor fraud prevention, security and clasfficiation models", "### Out-of-Scope Use", "## Dataset Creation", "### Curation Rationale\n\nBelow is an example of the dialogue and summary synthetic data we used to train a Mistral 7b model for real time fraud detection. We were faced with data constraints for real time dailogure, synthetic data becomes our ally in enhancing model performance.", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-text-classification #task_categories-summarization #task_categories-question-answering #task_categories-conversational #task_categories-text-to-speech #size_categories-10K<n<100K #license-mit #phishing #fraud #region-us \n", "# Dataset Card for Cooperate Phishing\n\n\n\nBilic Phishing Corpus contains over 20k chat dialogues with annotated summaries. There are two features: - dialogue: text of dialogue. - summary: A summary of the dialogure to determine phishing attempt. \"", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Dialogue: Intra Cooperate Conversation \n- Summary: A summary of the dialogure to determine phishing attempt", "### Dataset Sources [optional]\n\n\n\n- Repository: URL", "## Uses\n\nBuild models to Anti- Phishing detection in a conversation or dialogue", "### Direct Use\n\nFor fraud prevention, security and clasfficiation models", "### Out-of-Scope Use", "## Dataset Creation", "### Curation Rationale\n\nBelow is an example of the dialogue and summary synthetic data we used to train a Mistral 7b model for real time fraud detection. We were faced with data constraints for real time dailogure, synthetic data becomes our ally in enhancing model performance.", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ 85, 57, 4, 30, 16, 18, 17, 9, 5, 71, 4, 7, 10, 9, 5, 9, 8, 10, 46, 8, 7, 10, 5 ]
[ "passage: TAGS\n#task_categories-text-classification #task_categories-summarization #task_categories-question-answering #task_categories-conversational #task_categories-text-to-speech #size_categories-10K<n<100K #license-mit #phishing #fraud #region-us \n# Dataset Card for Cooperate Phishing\n\n\n\nBilic Phishing Corpus contains over 20k chat dialogues with annotated summaries. There are two features: - dialogue: text of dialogue. - summary: A summary of the dialogure to determine phishing attempt. \"## Dataset Details### Dataset Description\n\n\n\n\n\n- Dialogue: Intra Cooperate Conversation \n- Summary: A summary of the dialogure to determine phishing attempt### Dataset Sources [optional]\n\n\n\n- Repository: URL## Uses\n\nBuild models to Anti- Phishing detection in a conversation or dialogue### Direct Use\n\nFor fraud prevention, security and clasfficiation models### Out-of-Scope Use## Dataset Creation### Curation Rationale\n\nBelow is an example of the dialogue and summary synthetic data we used to train a Mistral 7b model for real time fraud detection. We were faced with data constraints for real time dailogure, synthetic data becomes our ally in enhancing model performance.### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact" ]
a9c20705da7100226ec794323861ff75eec5d773
# Dataset Card for "GPTSilkomePretrained" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mjbuehler/GPTSilkomePretrained
[ "region:us" ]
2023-12-04T12:54:17+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 213036133, "num_examples": 731354}], "download_size": 203708011, "dataset_size": 213036133}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T12:54:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "GPTSilkomePretrained" More Information needed
[ "# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
3369cec143ca38a5677205e79ba91e3396257339
# ssl-checkpoints The code to load the checkpoints will follow. The repository is organised as follows: - Each folder corresponds to the dataset used for our experiment. - Each subfolder represents the corresponding SSL technique used. - These subfolders contain the checkpoints for each transformation/pretext task considered. The five checkpoint files correspond to the transformations Baseline, SimCLR, Orthogonality, LoRot and DCL, respectively, as described in the blog.
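Until that loading code lands, here is a heavily hedged sketch that assumes standard PyTorch checkpoint files laid out as described above (the path and the presence of a `state_dict` key are assumptions, not stated by the card):

```python
import torch

# Hypothetical path following the <dataset>/<SSL technique>/<transformation> layout above;
# the actual folder and file names may differ.
checkpoint = torch.load("cifar10/simclr/lorot.ckpt", map_location="cpu")

# Many PyTorch checkpoints either are a state_dict or wrap one under "state_dict";
# this is an assumption about the file format, not something the card guarantees.
state_dict = checkpoint.get("state_dict", checkpoint) if isinstance(checkpoint, dict) else checkpoint
print(list(state_dict)[:5])
```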
CATIE-AQ/ssl-checkpoints
[ "region:us" ]
2023-12-04T12:55:26+00:00
{}
2023-12-05T20:05:38+00:00
[]
[]
TAGS #region-us
# ssl-checkpoints The code to load the checkpoints to follow... The repository is organised as follows: - Each folder corresponds to the data set used for our experiment. - Each subfolder represents the corresponding SSL technique used. - These subfolders contain the checkpoints for each transformation/pretext task considered. The five checkpoint files correspond to the transformation Baseline, SimClr, Orthogonality, LoRot and DCL, respectively, described in the blog.
[ "# ssl-checkpoints\n\nThe code to load the checkpoints to follow...\nThe repository is organised as follows:\n\n- Each folder corresponds to the data set used for our experiment.\n- Each subfolder represents the corresponding SSL technique used.\n- These subfolders contain the checkpoints for each transformation/pretext task considered. The five checkpoint files correspond to\n the transformation Baseline, SimClr, Orthogonality, LoRot and DCL, respectively, described in the blog." ]
[ "TAGS\n#region-us \n", "# ssl-checkpoints\n\nThe code to load the checkpoints to follow...\nThe repository is organised as follows:\n\n- Each folder corresponds to the data set used for our experiment.\n- Each subfolder represents the corresponding SSL technique used.\n- These subfolders contain the checkpoints for each transformation/pretext task considered. The five checkpoint files correspond to\n the transformation Baseline, SimClr, Orthogonality, LoRot and DCL, respectively, described in the blog." ]
[ 6, 113 ]
[ "passage: TAGS\n#region-us \n# ssl-checkpoints\n\nThe code to load the checkpoints to follow...\nThe repository is organised as follows:\n\n- Each folder corresponds to the data set used for our experiment.\n- Each subfolder represents the corresponding SSL technique used.\n- These subfolders contain the checkpoints for each transformation/pretext task considered. The five checkpoint files correspond to\n the transformation Baseline, SimClr, Orthogonality, LoRot and DCL, respectively, described in the blog." ]
8687b59dbe8853be7a4367cb9d82d95a40724bdb
# Dataset Card for "GPTSilkomePretrained" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lamm-mit/SilkomePretrainedGPT
[ "region:us" ]
2023-12-04T13:01:11+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 213036133, "num_examples": 731354}], "download_size": 203708011, "dataset_size": 213036133}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T13:01:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "GPTSilkomePretrained" More Information needed
[ "# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"GPTSilkomePretrained\"\n\nMore Information needed" ]
ca2553af198ee670f2c0b097201ded609a334b10
# Synthetic Malaysian Extractive QA Generated using ChatGPT3.5, notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/question-answer/chatgpt3.5-extractive-news
mesolitica/chatgpt-malaysian-extractive-qa
[ "task_categories:question-answering", "language:ms", "region:us" ]
2023-12-04T13:06:49+00:00
{"language": ["ms"], "task_categories": ["question-answering"], "pretty_name": "synthetic-malaysian-extract-qa"}
2023-12-04T13:32:29+00:00
[]
[ "ms" ]
TAGS #task_categories-question-answering #language-Malay (macrolanguage) #region-us
# Synthetic Malaysian Extractive QA Generated using ChatGPT3.5, notebooks at URL
[ "# Synthetic Malaysian Extractive QA\n\nGenerated using ChatGPT3.5, notebooks at URL" ]
[ "TAGS\n#task_categories-question-answering #language-Malay (macrolanguage) #region-us \n", "# Synthetic Malaysian Extractive QA\n\nGenerated using ChatGPT3.5, notebooks at URL" ]
[ 28, 23 ]
[ "passage: TAGS\n#task_categories-question-answering #language-Malay (macrolanguage) #region-us \n# Synthetic Malaysian Extractive QA\n\nGenerated using ChatGPT3.5, notebooks at URL" ]
98acad6712f12bb938bb0a8cf025bbace114c7e5
# Inria Aerial Image Labeling Dataset ![Inria Aerial Image Labeling](./thumbnail.png) ## Description The Inria Aerial Image Labeling Dataset is a building semantic segmentation dataset proposed in ["Can semantic labeling methods generalize to any city? the inria aerial image labeling benchmark," Maggiori et al.](https://ieeexplore.ieee.org/document/8127684). It consists of 360 high-resolution (0.3m) RGB images, each with a size of 5000x5000 pixels. These images are extracted from various international GIS services, such as the [USGS National Map](https://www.usgs.gov/core-science-systems/national-geospatial-program/national-map). Project page: https://project.inria.fr/aerialimagelabeling/ ## Details ## Structure ```tree . ├── README.md └── data    ├── test    │   └── images    │      ├── bellingham1.tif    │      ├── bellingham2.tif    │      ├── ...    │      └── tyrol-e36.tif    └── train       ├── gt       │   ├── austin1.tif       │   ├── austin2.tif       │   ├── ...       │   └── vienna36.tif       └── images          ├── austin1.tif          ├── austin2.tif          ├── ...          └── vienna36.tif ``` ### Statistics - **Image Resolution:** 0.3 meters per pixel - **Image Size:** 5000x5000 pixels - **Total Images:** 360 - **Regions:** 10 regions around the world, including both urban and rural areas. - **Split:** Train and test sets are split into different cities for evaluating model generalization across dramatically different locations. - **Test Set Ground Truth Masks:** Note that the ground truth masks for the test set have not been publicly released. The dataset was originally used in the [Inria Aerial Image Labeling Dataset Contest](https://project.inria.fr/aerialimagelabeling/contest/). ### About the Dataset The Inria Aerial Image Labeling Dataset is a comprehensive resource for semantic segmentation tasks in the field of remote sensing, with additional information as follows: - **Dataset Coverage:** The dataset spans a total area of 810 km², meticulously divided into 405 km² for training and another 405 km² for testing purposes. - **Image Characteristics:** This dataset offers aerial orthorectified color imagery, capturing scenes at an impressive spatial resolution of 0.3 meters per pixel. - **Semantic Classes:** Ground truth data is provided for two fundamental semantic classes: "building" and "not building." It's important to note that ground truth data for the "not building" class is publicly disclosed exclusively for the training subset. - **Diverse Urban Settlements:** The images cover a diverse range of urban settlements, ranging from densely populated areas such as San Francisco's financial district to picturesque alpine towns like Lienz in Austrian Tyrol. - **City-Based Split:** Instead of merely dividing adjacent portions of the same images into the training and test subsets, this dataset adopts a unique approach. Different cities are included in each of the subsets. For instance, images from Chicago are part of the training set and excluded from the test set, while images from San Francisco are included in the test set and not in the training set. This design aims to assess the generalization capabilities of semantic labeling techniques across regions with varying illumination conditions, urban landscapes, and times of the year. - **Data Sources:** The dataset was meticulously constructed by combining publicly available imagery and official building footprints. 
This additional information further enriches the understanding of the Inria Aerial Image Labeling Dataset and its potential applications in remote sensing research. <!-- ### Sample Images You can explore sample images from this dataset: - ![Chicago](https://project.inria.fr/aerialimagelabeling/files/2011/12/chi1.jpg) Chicago - ![Chicago - Reference](https://project.inria.fr/aerialimagelabeling/files/2011/12/chi2.jpg) Chicago - Reference - ![Kitsap County, WA](https://project.inria.hal.science/hal-01468452/document/kit1.jpg) Kitsap County, WA - ![Kitsap County, WA - Reference](https://project.inria.hal.science/hal-01468452/document/kit2.jpg) Kitsap County, WA - Reference - ![Vienna](https://project.inria.hal.science/hal-01468452/document/vie1.jpg) Vienna - ![Vienna - Reference](https://project.inria.hal.science/hal-01468452/document/vie2.jpg) Vienna - Reference --> ## Citation If you use the Inria Aerial Image Labeling Dataset in your research, please consider citing the following publication or the dataset's official website: ```bibtex @inproceedings{maggiori2017inria, title = {Can semantic labeling methods generalize to any city? The Inria aerial image labeling benchmark}, author = {Maggiori, Emmanuel and Tarabalka, Yuliya and Charpiat, Guillaume and Alliez, Pierre}, booktitle = {IEEE International Geoscience and Remote Sensing Symposium (IGARSS)}, year = {2017}, publisher = {IEEE} } ``` [Can semantic labeling methods generalize to any city? The Inria aerial image labeling benchmark](https://ieeexplore.ieee.org/document/8127684)
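A minimal loading sketch, assuming the directory layout shown above with the archive extracted to a local `data/` folder (the paths are illustrative, not prescribed by the dataset):

```python
from pathlib import Path
from PIL import Image

# Paths follow the tree in the card; "data/train" is the assumed extraction root.
root = Path("data/train")
image = Image.open(root / "images" / "austin1.tif")
mask = Image.open(root / "gt" / "austin1.tif")
print(image.size, mask.size)  # expected (5000, 5000) per the card
```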
blanchon/INRIA-Aerial-Image-Labeling
[ "task_categories:image-segmentation", "size_categories:n<1K", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-segmentation", "semantic-segmentation", "building-labeling", "arxiv:1608.05167", "region:us" ]
2023-12-04T13:17:26+00:00
{"language": ["en"], "license": ["unknown"], "size_categories": ["n<1K"], "task_categories": ["image-segmentation"], "pretty_name": "Inria Aerial Image Labeling Dataset", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-segmentation", "semantic-segmentation", "building-labeling"]}
2023-12-04T14:47:42+00:00
[ "1608.05167" ]
[ "en" ]
TAGS #task_categories-image-segmentation #size_categories-n<1K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-segmentation #semantic-segmentation #building-labeling #arxiv-1608.05167 #region-us
# Inria Aerial Image Labeling Dataset !Inria Aerial Image Labeling ## Description The Inria Aerial Image Labeling Dataset is a building semantic segmentation dataset proposed in "Can semantic labeling methods generalize to any city? the inria aerial image labeling benchmark," Maggiori et al.. It consists of 360 high-resolution (0.3m) RGB images, each with a size of 5000x5000 pixels. These images are extracted from various international GIS services, such as the USGS National Map. Project page: URL ## Details ## Structure ### Statistics - Image Resolution: 0.3 meters per pixel - Image Size: 5000x5000 pixels - Total Images: 360 - Regions: 10 regions around the world, including both urban and rural areas. - Split: Train and test sets are split into different cities for evaluating model generalization across dramatically different locations. - Test Set Ground Truth Masks: Note that the ground truth masks for the test set have not been publicly released. The dataset was originally used in the Inria Aerial Image Labeling Dataset Contest. ### About the Dataset The Inria Aerial Image Labeling Dataset is a comprehensive resource for semantic segmentation tasks in the field of remote sensing, with additional information as follows: - Dataset Coverage: The dataset spans a total area of 810 km², meticulously divided into 405 km² for training and another 405 km² for testing purposes. - Image Characteristics: This dataset offers aerial orthorectified color imagery, capturing scenes at an impressive spatial resolution of 0.3 meters per pixel. - Semantic Classes: Ground truth data is provided for two fundamental semantic classes: "building" and "not building." It's important to note that ground truth data for the "not building" class is publicly disclosed exclusively for the training subset. - Diverse Urban Settlements: The images cover a diverse range of urban settlements, ranging from densely populated areas such as San Francisco's financial district to picturesque alpine towns like Lienz in Austrian Tyrol. - City-Based Split: Instead of merely dividing adjacent portions of the same images into the training and test subsets, this dataset adopts a unique approach. Different cities are included in each of the subsets. For instance, images from Chicago are part of the training set and excluded from the test set, while images from San Francisco are included in the test set and not in the training set. This design aims to assess the generalization capabilities of semantic labeling techniques across regions with varying illumination conditions, urban landscapes, and times of the year. - Data Sources: The dataset was meticulously constructed by combining publicly available imagery and official building footprints. This additional information further enriches the understanding of the Inria Aerial Image Labeling Dataset and its potential applications in remote sensing research. If you use the Inria Aerial Image Labeling Dataset dataset in your research, please consider citing the following publication or the dataset's official website: AID: A Benchmark Dataset for Performance Evaluation of Aerial Scene Classification
[ "# Inria Aerial Image Labeling Dataset\n\n!Inria Aerial Image Labeling", "## Description\n\nThe Inria Aerial Image Labeling Dataset is a building semantic segmentation dataset proposed in \"Can semantic labeling methods generalize to any city? the inria aerial image labeling benchmark,\" Maggiori et al.. It consists of 360 high-resolution (0.3m) RGB images, each with a size of 5000x5000 pixels. These images are extracted from various international GIS services, such as the USGS National Map.\n\nProject page: URL", "## Details", "## Structure", "### Statistics\n\n- Image Resolution: 0.3 meters per pixel\n- Image Size: 5000x5000 pixels\n- Total Images: 360\n- Regions: 10 regions around the world, including both urban and rural areas.\n- Split: Train and test sets are split into different cities for evaluating model generalization across dramatically different locations.\n- Test Set Ground Truth Masks: Note that the ground truth masks for the test set have not been publicly released.\n\nThe dataset was originally used in the Inria Aerial Image Labeling Dataset Contest.", "### About the Dataset\n\nThe Inria Aerial Image Labeling Dataset is a comprehensive resource for semantic segmentation tasks in the field of remote sensing, with additional information as follows:\n\n- Dataset Coverage: The dataset spans a total area of 810 km², meticulously divided into 405 km² for training and another 405 km² for testing purposes.\n\n- Image Characteristics: This dataset offers aerial orthorectified color imagery, capturing scenes at an impressive spatial resolution of 0.3 meters per pixel.\n\n- Semantic Classes: Ground truth data is provided for two fundamental semantic classes: \"building\" and \"not building.\" It's important to note that ground truth data for the \"not building\" class is publicly disclosed exclusively for the training subset.\n\n- Diverse Urban Settlements: The images cover a diverse range of urban settlements, ranging from densely populated areas such as San Francisco's financial district to picturesque alpine towns like Lienz in Austrian Tyrol.\n\n- City-Based Split: Instead of merely dividing adjacent portions of the same images into the training and test subsets, this dataset adopts a unique approach. Different cities are included in each of the subsets. For instance, images from Chicago are part of the training set and excluded from the test set, while images from San Francisco are included in the test set and not in the training set. This design aims to assess the generalization capabilities of semantic labeling techniques across regions with varying illumination conditions, urban landscapes, and times of the year.\n\n- Data Sources: The dataset was meticulously constructed by combining publicly available imagery and official building footprints.\n\nThis additional information further enriches the understanding of the Inria Aerial Image Labeling Dataset and its potential applications in remote sensing research.\n\n\n\nIf you use the Inria Aerial Image Labeling Dataset dataset in your research, please consider citing the following publication or the dataset's official website:\n\n\n\nAID: A Benchmark Dataset for Performance Evaluation of Aerial Scene Classification" ]
[ "TAGS\n#task_categories-image-segmentation #size_categories-n<1K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-segmentation #semantic-segmentation #building-labeling #arxiv-1608.05167 #region-us \n", "# Inria Aerial Image Labeling Dataset\n\n!Inria Aerial Image Labeling", "## Description\n\nThe Inria Aerial Image Labeling Dataset is a building semantic segmentation dataset proposed in \"Can semantic labeling methods generalize to any city? the inria aerial image labeling benchmark,\" Maggiori et al.. It consists of 360 high-resolution (0.3m) RGB images, each with a size of 5000x5000 pixels. These images are extracted from various international GIS services, such as the USGS National Map.\n\nProject page: URL", "## Details", "## Structure", "### Statistics\n\n- Image Resolution: 0.3 meters per pixel\n- Image Size: 5000x5000 pixels\n- Total Images: 360\n- Regions: 10 regions around the world, including both urban and rural areas.\n- Split: Train and test sets are split into different cities for evaluating model generalization across dramatically different locations.\n- Test Set Ground Truth Masks: Note that the ground truth masks for the test set have not been publicly released.\n\nThe dataset was originally used in the Inria Aerial Image Labeling Dataset Contest.", "### About the Dataset\n\nThe Inria Aerial Image Labeling Dataset is a comprehensive resource for semantic segmentation tasks in the field of remote sensing, with additional information as follows:\n\n- Dataset Coverage: The dataset spans a total area of 810 km², meticulously divided into 405 km² for training and another 405 km² for testing purposes.\n\n- Image Characteristics: This dataset offers aerial orthorectified color imagery, capturing scenes at an impressive spatial resolution of 0.3 meters per pixel.\n\n- Semantic Classes: Ground truth data is provided for two fundamental semantic classes: \"building\" and \"not building.\" It's important to note that ground truth data for the \"not building\" class is publicly disclosed exclusively for the training subset.\n\n- Diverse Urban Settlements: The images cover a diverse range of urban settlements, ranging from densely populated areas such as San Francisco's financial district to picturesque alpine towns like Lienz in Austrian Tyrol.\n\n- City-Based Split: Instead of merely dividing adjacent portions of the same images into the training and test subsets, this dataset adopts a unique approach. Different cities are included in each of the subsets. For instance, images from Chicago are part of the training set and excluded from the test set, while images from San Francisco are included in the test set and not in the training set. This design aims to assess the generalization capabilities of semantic labeling techniques across regions with varying illumination conditions, urban landscapes, and times of the year.\n\n- Data Sources: The dataset was meticulously constructed by combining publicly available imagery and official building footprints.\n\nThis additional information further enriches the understanding of the Inria Aerial Image Labeling Dataset and its potential applications in remote sensing research.\n\n\n\nIf you use the Inria Aerial Image Labeling Dataset dataset in your research, please consider citing the following publication or the dataset's official website:\n\n\n\nAID: A Benchmark Dataset for Performance Evaluation of Aerial Scene Classification" ]
[ 90, 20, 107, 2, 4, 119, 478 ]
[ "passage: TAGS\n#task_categories-image-segmentation #size_categories-n<1K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-segmentation #semantic-segmentation #building-labeling #arxiv-1608.05167 #region-us \n# Inria Aerial Image Labeling Dataset\n\n!Inria Aerial Image Labeling## Description\n\nThe Inria Aerial Image Labeling Dataset is a building semantic segmentation dataset proposed in \"Can semantic labeling methods generalize to any city? the inria aerial image labeling benchmark,\" Maggiori et al.. It consists of 360 high-resolution (0.3m) RGB images, each with a size of 5000x5000 pixels. These images are extracted from various international GIS services, such as the USGS National Map.\n\nProject page: URL## Details## Structure### Statistics\n\n- Image Resolution: 0.3 meters per pixel\n- Image Size: 5000x5000 pixels\n- Total Images: 360\n- Regions: 10 regions around the world, including both urban and rural areas.\n- Split: Train and test sets are split into different cities for evaluating model generalization across dramatically different locations.\n- Test Set Ground Truth Masks: Note that the ground truth masks for the test set have not been publicly released.\n\nThe dataset was originally used in the Inria Aerial Image Labeling Dataset Contest." ]
a65024b6611b36cc8ecfbe461224ebdd95c0d7e5
# Danbooru Public Danbooru database, which includes metadata for ["posts"](https://danbooru.donmai.us/wiki_pages/help:posts), ["tags"](https://danbooru.donmai.us/wiki_pages/help:tags) and ["artists"](https://danbooru.donmai.us/artists). Download from [danbooru public google cloud storage](https://console.cloud.google.com/storage/browser/danbooru_public/data?project=danbooru1). Updated at 2023/11/30. Data are encoded with [JSON Lines](https://jsonlines.org/). ```bash tar -xJf tags.tar.xz tar -xJf artists.tar.xz # posts.tar.br is compressed with brotli # --use-compress-program might also work # please note that the output is near 20GB brotli --decompress --stdout posts.tar.br | tar -xf - ``` ## See also - [crosstyan/explore-danbooru](https://github.com/crosstyan/explore-danbooru)
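Since the extracted files are JSON Lines, a small reading sketch may help (the file name and the record fields are assumptions about the extracted archive, not guaranteed by the card):

```python
import json

# Stream records one line at a time; "posts.json" and the field names are assumed.
with open("posts.json", encoding="utf-8") as f:
    for i, line in enumerate(f):
        post = json.loads(line)
        print(post.get("id"), str(post.get("tag_string", ""))[:60])
        if i == 2:
            break
```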
Crosstyan/danbooru-public
[ "danbooru", "region:us" ]
2023-12-04T13:18:14+00:00
{"tags": ["danbooru"]}
2024-01-21T14:56:29+00:00
[]
[]
TAGS #danbooru #region-us
# Danbooru Public Danbooru database which including metadata for "posts", "tags" and "artists". Download from danbooru public google cloud storage. Updated at 2023/11/30. Data are encoded with JSON Lines. ## See also - crosstyan/explore-danbooru
[ "# Danbooru Public\n\nDanbooru database which including metadata for \"posts\",\n\"tags\" and \"artists\".\n\nDownload from danbooru public google cloud storage.\nUpdated at 2023/11/30.\n\nData are encoded with JSON Lines.", "## See also\n\n- crosstyan/explore-danbooru" ]
[ "TAGS\n#danbooru #region-us \n", "# Danbooru Public\n\nDanbooru database which including metadata for \"posts\",\n\"tags\" and \"artists\".\n\nDownload from danbooru public google cloud storage.\nUpdated at 2023/11/30.\n\nData are encoded with JSON Lines.", "## See also\n\n- crosstyan/explore-danbooru" ]
[ 10, 54, 14 ]
[ "passage: TAGS\n#danbooru #region-us \n# Danbooru Public\n\nDanbooru database which including metadata for \"posts\",\n\"tags\" and \"artists\".\n\nDownload from danbooru public google cloud storage.\nUpdated at 2023/11/30.\n\nData are encoded with JSON Lines.## See also\n\n- crosstyan/explore-danbooru" ]
d422072bb3095f880c5ab062f2ca6facbbd77408
# Dataset Card for "thai-wiktionary-prompt" This dataset made to create prompt for get the meaning of word. Dataset from [Thai wiktionary](https://th.wiktionary.org/). ## Template ``` Inputs: จงบอกความหมายของคำต่อไปนี้: {word} Targets: คำว่า {word} มีความหมายตามชนิดของคำได้ดังต่อไปนี้\n - POS มีความหมายได้ว่า the meaning of word (split with "/") ``` Example: ``` Inputs: จงบอกความหมายของคำต่อไปนี้: ศึกษา Targets: คำว่า ศึกษา มีความหมายตามชนิดของคำได้ดังต่อไปนี้ - คำนาม มีความหมายได้ว่า การเล่าเรียน การฝึกฝน และการอบรม - คำกริยา มีความหมายได้ว่า เล่าเรียน ฝึกฝน และอบรม ``` PyThaiNLP
pythainlp/thai-wiktionary-prompt
[ "task_categories:text2text-generation", "task_categories:text-generation", "size_categories:10K<n<100K", "language:th", "license:cc-by-sa-3.0", "region:us" ]
2023-12-04T13:33:04+00:00
{"language": ["th"], "license": "cc-by-sa-3.0", "size_categories": ["10K<n<100K"], "task_categories": ["text2text-generation", "text-generation"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10265752, "num_examples": 19729}], "download_size": 2491006, "dataset_size": 10265752}}
2023-12-06T07:51:22+00:00
[]
[ "th" ]
TAGS #task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us
# Dataset Card for "thai-wiktionary-prompt" This dataset made to create prompt for get the meaning of word. Dataset from Thai wiktionary. ## Template Example: PyThaiNLP
[ "# Dataset Card for \"thai-wiktionary-prompt\"\n\nThis dataset made to create prompt for get the meaning of word.\n\nDataset from Thai wiktionary.", "## Template\n\n\nExample:\n\n\n\nPyThaiNLP" ]
[ "TAGS\n#task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us \n", "# Dataset Card for \"thai-wiktionary-prompt\"\n\nThis dataset made to create prompt for get the meaning of word.\n\nDataset from Thai wiktionary.", "## Template\n\n\nExample:\n\n\n\nPyThaiNLP" ]
[ 58, 35, 10 ]
[ "passage: TAGS\n#task_categories-text2text-generation #task_categories-text-generation #size_categories-10K<n<100K #language-Thai #license-cc-by-sa-3.0 #region-us \n# Dataset Card for \"thai-wiktionary-prompt\"\n\nThis dataset made to create prompt for get the meaning of word.\n\nDataset from Thai wiktionary.## Template\n\n\nExample:\n\n\n\nPyThaiNLP" ]
8147c8ffe2f6c5253a99a0d62d616787ddeb8a7d
# Dataset Card for "GPTProteinPretrained" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lamm-mit/GPTProteinPretrained
[ "region:us" ]
2023-12-04T13:48:11+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 266893442, "num_examples": 772027}], "download_size": 257822249, "dataset_size": 266893442}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-04T13:48:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for "GPTProteinPretrained" More Information needed
[ "# Dataset Card for \"GPTProteinPretrained\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"GPTProteinPretrained\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"GPTProteinPretrained\"\n\nMore Information needed" ]
d9b5531544523f29d365a77a007b2ace3cea6882
# Aerial Image Dataset (AID) ![AID](./thumbnail.png) ## Description The Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using [Google Earth](https://earth.google.com/web/) and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class. The dataset is made up of the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks and viaduct. All the images are labelled by specialists in the field of remote sensing image interpretation. In all, the AID dataset contains 10,000 images across 30 classes. The dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture. ## Details ## Structure ```tree . ├── README.md └── data    ├── Airport    │   ├── airport_1.png    │   ├── airport_2.png    │   ├── ...    │   └── airport_360.png    ├── BareLand    │   ├── bareland_1.png    │   ├── ...    │   └── bareland_310.png    ├── ...    └── Viaduct ``` ### Statistics - Total Number of Images: 10,000 - Image Resolution: 600x600 pixels - Scene Categories: 30 - Dataset Size: 2.6GB ## Citation If you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication: ```bibtex @article{xia2017aid, title = {AID: A benchmark data set for performance evaluation of aerial scene classification}, author = {Xia, Gui-Song and Hu, Jingwen and Hu, Fan and Shi, Baoguang and Bai, Xiang and Zhong, Yanfei and Zhang, Liangpei and Lu, Xiaoqiang}, journal = {IEEE Transactions on Geoscience and Remote Sensing}, volume = {55}, number = {7}, pages = {3965-3981}, year = {2017}, publisher = {IEEE} } ``` Paper with code: https://paperswithcode.com/dataset/aid
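Given the class-per-folder layout above, a quick classification-oriented loading sketch (the `data` extraction root is an assumption; any ImageFolder-style loader works):

```python
from torchvision import datasets, transforms

# The folder structure above matches torchvision's ImageFolder convention
# (one subfolder per scene category under the assumed "data" root).
transform = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
aid = datasets.ImageFolder(root="data", transform=transform)
print(len(aid), aid.classes[:5])
```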
blanchon/AID
[ "task_categories:image-classification", "size_categories:1K<n<10K", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "region:us" ]
2023-12-04T13:57:13+00:00
{"language": ["en"], "license": ["unknown"], "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "pretty_name": "Aerial Image Dataset (AID)", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery"]}
2023-12-04T13:58:07+00:00
[]
[ "en" ]
TAGS #task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #region-us
# Aerial Image Dataset (AID) !AID ## Description The Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using Google Earth and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class. The new dataset is made up of the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks and viaduct. All the images are labelled by the specialists in the field of remote sensing image interpretation, and some samples of each class are shown in Fig.1. In all, the AID dataset has a number of 10000 images within 30 classes. The dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture. ## Details ## Structure ### Statistics - Total Number of Images: 10,000 - Image Resolution: 600x600 pixels - Scene Categories: 30 - Dataset Size: 2.6GB If you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication: Paper with code: URL
[ "# Aerial Image Dataset (AID)\n\n!AID", "## Description\n\nThe Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using Google Earth and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class.\n\nThe new dataset is made up of the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks and viaduct. All the images are labelled by the specialists in the field of remote sensing image interpretation, and some samples of each class are shown in Fig.1. In all, the AID dataset has a number of 10000 images within 30 classes.\n\nThe dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 10,000\n- Image Resolution: 600x600 pixels\n- Scene Categories: 30\n- Dataset Size: 2.6GB\n\nIf you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication:\n\n\n\nPaper with code: URL" ]
[ "TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #region-us \n", "# Aerial Image Dataset (AID)\n\n!AID", "## Description\n\nThe Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using Google Earth and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class.\n\nThe new dataset is made up of the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks and viaduct. All the images are labelled by the specialists in the field of remote sensing image interpretation, and some samples of each class are shown in Fig.1. In all, the AID dataset has a number of 10000 images within 30 classes.\n\nThe dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 10,000\n- Image Resolution: 600x600 pixels\n- Scene Categories: 30\n- Dataset Size: 2.6GB\n\nIf you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication:\n\n\n\nPaper with code: URL" ]
[ 64, 14, 276, 2, 4, 64 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #region-us \n# Aerial Image Dataset (AID)\n\n!AID## Description\n\nThe Aerial Image Dataset (AID) is a scene classification dataset consisting of 10,000 RGB images, each with a resolution of 600x600 pixels. These images have been extracted using Google Earth and cover various scenes from regions and countries around the world. AID comprises 30 different scene categories, with several hundred images per class.\n\nThe new dataset is made up of the following 30 aerial scene types: airport, bare land, baseball field, beach, bridge, center, church, commercial, dense residential, desert, farmland, forest, industrial, meadow, medium residential, mountain, park, parking, playground, pond, port, railway station, resort, river, school, sparse residential, square, stadium, storage tanks and viaduct. All the images are labelled by the specialists in the field of remote sensing image interpretation, and some samples of each class are shown in Fig.1. In all, the AID dataset has a number of 10000 images within 30 classes.\n\nThe dataset is designed for the evaluation of aerial scene classification algorithms and models. It is considered a relatively easy dataset, with approximately 90% accuracy achievable using a VGG-16 architecture.## Details## Structure### Statistics\n- Total Number of Images: 10,000\n- Image Resolution: 600x600 pixels\n- Scene Categories: 30\n- Dataset Size: 2.6GB\n\nIf you use the Aerial Image Dataset (AID) in your research, please consider citing the following publication:\n\n\n\nPaper with code: URL" ]
921e207ea6aa26e9366fd200725a98b0067f9d6b
# ETCI 2021 Flood Detection Dataset ![ETCI 2021 Flood Detection](./thumbnail.jpg) ## Description The [ETCI 2021 Flood Detection Dataset](https://nasa-impact.github.io/etci2021/) is a comprehensive flood detection segmentation dataset that focuses on SAR (Synthetic Aperture Radar) images taken by the [ESA Sentinel-1 satellite](https://sentinel.esa.int/web/sentinel/missions/sentinel-1). This dataset provides pairs of VV (Vertical Transmit, Vertical Receive) and VH (Vertical Transmit, Horizontal Receive) polarization images, which have been processed by the Hybrid Pluggable Processing Pipeline (hyp3). Additionally, the dataset includes corresponding binary flood and water body ground truth masks. The dataset is composed of 66,810 (33,405 x 2 VV & VH polarization) tiles of 256×256 pixels, distributed respectively across the training, validation and test sets as follows: 33,405, 10,400, and 12,348 tiles for each polarization. Each tile includes 3 RGB channels which have been converted by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery data using Hybrid Pluggable Processing Pipeline “hyp3”. Training tiles correspond to intensity values for VV and VH polarization. The ETCI 2021 dataset is valuable for flood detection and segmentation tasks and facilitates research and development in this domain. ## Details ## Structure ```tree . ├── README.md └── data    ├── test    │   ├── florence_20180510t231343    │   │   ├── tiles    │   │   │   ├── flood_label    │   │   │   │   ├── florence_20180510t231343_x-0_y-0_vv.png    │   │   │   │   └── ...    │   │   │   ├── vh    │   │   │   │   ├── florence_20180510t231343_x-0_y-0_vh.png    │   │   │   │   └── ...    │   │   │   ├── vv    │   │   │   │ ├── florence_20180510t231343_x-0_y-0_vv.png    │   │   │   │ └── ...    │   │   │   └── water_body_label    │   │   │   ├── florence_20180510t231343_x-0_y-0_vv.png    │   │   │   └── ...    │   │   └── ...    │   └── ...    ├── test_internal    │   └── ...    └── train       └── ... ``` ### Statistics - Total Number of Images: 66,810 - SAR Image Resolution: 256x256 pixels - Polarization: VV and VH - Ground Truth Masks: Binary flood and water body masks - Dataset Size: 5.6GB - File name prefix: `<region>_<datetime>*_x-*_y-*_<vv | vh>.png` ## Citation If you use the ETCI 2021 Flood Detection dataset in your research, please consider citing the following publication or the dataset's official website: https://nasa-impact.github.io/etci2021/
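Since the card describes the raw tile layout but no loading API, the sketch below shows one plausible way to pair each VV tile with its VH counterpart and the two masks, following the directory tree and file-name pattern above. The root path is an assumption (one of the region folders from the tree), and the 3-channel PNGs are collapsed to single-channel intensity for brevity.

```python
# Sketch: pairing ETCI 2021 VV/VH tiles with their flood and water-body masks.
from pathlib import Path
import numpy as np
from PIL import Image

# Hypothetical local path; any <region>_<datetime>/tiles folder from the tree above works the same way.
region_dir = Path("data/test/florence_20180510t231343/tiles")

for vv_path in sorted((region_dir / "vv").glob("*_vv.png")):
    stem = vv_path.name[: -len("_vv.png")]                        # e.g. florence_20180510t231343_x-0_y-0
    vh_path = region_dir / "vh" / f"{stem}_vh.png"
    flood_path = region_dir / "flood_label" / f"{stem}_vv.png"    # label files keep the _vv suffix
    water_path = region_dir / "water_body_label" / f"{stem}_vv.png"

    vv = np.asarray(Image.open(vv_path).convert("L"), dtype=np.float32) / 255.0
    vh = np.asarray(Image.open(vh_path).convert("L"), dtype=np.float32) / 255.0
    sar = np.stack([vv, vh])                                       # 2 x 256 x 256 model input
    flood_mask = np.asarray(Image.open(flood_path).convert("L")) > 0
    water_mask = np.asarray(Image.open(water_path).convert("L")) > 0
    # ... hand (sar, flood_mask, water_mask) to a segmentation pipeline
    break  # one tile pair shown
```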
blanchon/ETCI-2021-Flood-Detection
[ "task_categories:image-segmentation", "size_categories:1M<n<10M", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "flood-detection", "sar-images", "sentinel-1", "region:us" ]
2023-12-04T14:07:18+00:00
{"language": ["en"], "license": ["unknown"], "size_categories": ["1M<n<10M"], "task_categories": ["image-segmentation"], "pretty_name": "ETCI 2021 Flood Detection Dataset", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "flood-detection", "sar-images", "sentinel-1"]}
2023-12-04T14:10:14+00:00
[]
[ "en" ]
TAGS #task_categories-image-segmentation #size_categories-1M<n<10M #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #flood-detection #sar-images #sentinel-1 #region-us
# ETCI 2021 Flood Detection Dataset !ETCI 2021 Flood Detection ## Description The ETCI 2021 Flood Detection Dataset is a comprehensive flood detection segmentation dataset that focuses on SAR (Synthetic Aperture Radar) images taken by the ESA Sentinel-1 satellite. This dataset provides pairs of VV (Vertical Transmit, Vertical Receive) and VH (Vertical Transmit, Horizontal Receive) polarization images, which have been processed by the Hybrid Pluggable Processing Pipeline (hyp3). Additionally, the dataset includes corresponding binary flood and water body ground truth masks. The dataset is composed of 66,810 (33,405 x 2 VV & VH polarization) tiles of 256×256 pixels, distributed respectively across the training, validation and test sets as follows: 33,405, 10,400, and 12,348 tiles for each polarization. Each tile includes 3 RGB channels which have been converted by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery data using Hybrid Pluggable Processing Pipeline “hyp3”. Training tiles correspond to intensity values for VV and VH polarization. The ETCI 2021 dataset is valuable for flood detection and segmentation tasks and facilitates research and development in this domain. ## Details ## Structure ### Statistics - Total Number of Images: 66,810 - SAR Image Resolution: 256x256 pixels - Polarization: VV and VH - Ground Truth Masks: Binary flood and water body masks - Dataset Size: 5.6GB - File name prefix: '<region>_<datetime>*_x-*_y-*_<vv | vh>.png' If you use the ETCI 2021 Flood Detection dataset in your research, please consider citing the following publication or the dataset's official website: URL
[ "# ETCI 2021 Flood Detection Dataset\n\n!ETCI 2021 Flood Detection", "## Description\n\nThe ETCI 2021 Flood Detection Dataset is a comprehensive flood detection segmentation dataset that focuses on SAR (Synthetic Aperture Radar) images taken by the ESA Sentinel-1 satellite. This dataset provides pairs of VV (Vertical Transmit, Vertical Receive) and VH (Vertical Transmit, Horizontal Receive) polarization images, which have been processed by the Hybrid Pluggable Processing Pipeline (hyp3). Additionally, the dataset includes corresponding binary flood and water body ground truth masks.\n\nThe ataset is composed of 66,810 (33,405 x 2 VV & VH polarization) tiles of 256×256 pixels, distributed respectively across the training, validation and test sets as follows: 33,405, 10,400, and 12,348 tiles for each polarization. Each tile includes 3 RGB channels which have been converted by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery data using Hybrid Pluggable Processing Pipeline “hyp3”. Training tiles correspond to intensity values for VV and VH polarization with the following attributes.\n\nThe ETCI 2021 dataset is valuable for flood detection and segmentation tasks and facilitates research and development in this domain.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 66,810\n- SAR Image Resolution: 256x256 pixels\n- Polarization: VV and VH\n- Ground Truth Masks: Binary flood and water body masks\n- Dataset Size: 5.6GB\n- File name prefix: '<region>_<datetime>*_x-*_y-*_<vv | vh>.png'\n\nIf you use the ETCI 2021 Flood Detection dataset in your research, please consider citing the following publication or the dataset's official website:\n\nURL" ]
[ "TAGS\n#task_categories-image-segmentation #size_categories-1M<n<10M #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #flood-detection #sar-images #sentinel-1 #region-us \n", "# ETCI 2021 Flood Detection Dataset\n\n!ETCI 2021 Flood Detection", "## Description\n\nThe ETCI 2021 Flood Detection Dataset is a comprehensive flood detection segmentation dataset that focuses on SAR (Synthetic Aperture Radar) images taken by the ESA Sentinel-1 satellite. This dataset provides pairs of VV (Vertical Transmit, Vertical Receive) and VH (Vertical Transmit, Horizontal Receive) polarization images, which have been processed by the Hybrid Pluggable Processing Pipeline (hyp3). Additionally, the dataset includes corresponding binary flood and water body ground truth masks.\n\nThe ataset is composed of 66,810 (33,405 x 2 VV & VH polarization) tiles of 256×256 pixels, distributed respectively across the training, validation and test sets as follows: 33,405, 10,400, and 12,348 tiles for each polarization. Each tile includes 3 RGB channels which have been converted by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery data using Hybrid Pluggable Processing Pipeline “hyp3”. Training tiles correspond to intensity values for VV and VH polarization with the following attributes.\n\nThe ETCI 2021 dataset is valuable for flood detection and segmentation tasks and facilitates research and development in this domain.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 66,810\n- SAR Image Resolution: 256x256 pixels\n- Polarization: VV and VH\n- Ground Truth Masks: Binary flood and water body masks\n- Dataset Size: 5.6GB\n- File name prefix: '<region>_<datetime>*_x-*_y-*_<vv | vh>.png'\n\nIf you use the ETCI 2021 Flood Detection dataset in your research, please consider citing the following publication or the dataset's official website:\n\nURL" ]
[ 80, 18, 311, 2, 4, 123 ]
[ "passage: TAGS\n#task_categories-image-segmentation #size_categories-1M<n<10M #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #flood-detection #sar-images #sentinel-1 #region-us \n# ETCI 2021 Flood Detection Dataset\n\n!ETCI 2021 Flood Detection## Description\n\nThe ETCI 2021 Flood Detection Dataset is a comprehensive flood detection segmentation dataset that focuses on SAR (Synthetic Aperture Radar) images taken by the ESA Sentinel-1 satellite. This dataset provides pairs of VV (Vertical Transmit, Vertical Receive) and VH (Vertical Transmit, Horizontal Receive) polarization images, which have been processed by the Hybrid Pluggable Processing Pipeline (hyp3). Additionally, the dataset includes corresponding binary flood and water body ground truth masks.\n\nThe ataset is composed of 66,810 (33,405 x 2 VV & VH polarization) tiles of 256×256 pixels, distributed respectively across the training, validation and test sets as follows: 33,405, 10,400, and 12,348 tiles for each polarization. Each tile includes 3 RGB channels which have been converted by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery data using Hybrid Pluggable Processing Pipeline “hyp3”. Training tiles correspond to intensity values for VV and VH polarization with the following attributes.\n\nThe ETCI 2021 dataset is valuable for flood detection and segmentation tasks and facilitates research and development in this domain.## Details## Structure" ]
3c412e84b75637887a7ab3bc4d9cf966ebc24b1c
# Dataset Card for "text-classification-subject" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dumyy/text-classification-subject
[ "region:us" ]
2023-12-04T14:10:44+00:00
{"dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "\u6587\u5b66", "1": "\u6570\u5b66", "2": "\u82f1\u6587", "3": "\u7269\u7406", "4": "\u751f\u7269", "5": "\u5316\u5b66"}}}}], "splits": [{"name": "train", "num_bytes": 319, "num_examples": 8}, {"name": "val", "num_bytes": 319, "num_examples": 8}, {"name": "test", "num_bytes": 319, "num_examples": 8}], "download_size": 0, "dataset_size": 957}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-07T16:55:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "text-classification-subject" More Information needed
[ "# Dataset Card for \"text-classification-subject\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"text-classification-subject\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"text-classification-subject\"\n\nMore Information needed" ]
10bcb2d6c9895db2f072f97058c4cd1e33c8f182
# Dataset Card for "train-indo-squad-v2-test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
peterandrew987/train-indo-squad-v2-test
[ "task_categories:question-answering", "region:us" ]
2023-12-04T14:17:37+00:00
{"task_categories": ["question-answering"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}, {"name": "indonesian_answers", "struct": [{"name": "answer_end", "dtype": "int64"}, {"name": "answer_start", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}, {"name": "postags", "sequence": {"sequence": {"sequence": "string"}}}], "splits": [{"name": "train", "num_bytes": 13534958, "num_examples": 10925}], "download_size": 1875464, "dataset_size": 13534958}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-05T01:00:17+00:00
[]
[]
TAGS #task_categories-question-answering #region-us
# Dataset Card for "train-indo-squad-v2-test" More Information needed
[ "# Dataset Card for \"train-indo-squad-v2-test\"\n\nMore Information needed" ]
[ "TAGS\n#task_categories-question-answering #region-us \n", "# Dataset Card for \"train-indo-squad-v2-test\"\n\nMore Information needed" ]
[ 18, 21 ]
[ "passage: TAGS\n#task_categories-question-answering #region-us \n# Dataset Card for \"train-indo-squad-v2-test\"\n\nMore Information needed" ]
465fe09861ef08aa88d1b08680f121d00abf2bd2
# FAIR1M <!-- Dataset thumbnail --> ![FAIR1M](./thumbnail.jpg) <!-- Provide a quick summary of the dataset. --> The FAIR1M dataset is a fine-grained object recognition and detection dataset that focuses on high-resolution (0.3-0.8m) RGB images taken by the Gaogen (GF) satellites and extracted from Google Earth. It consists of a collection of 15,000 high-resolution images that cover various objects and scenes. The dataset provides annotations in the form of rotated bounding boxes for objects belonging to 5 main categories (ships, vehicles, airplanes, courts, and roads), further divided into 37 sub-categories. - **Paper:** https://arxiv.org/abs/2103.05569 - **Homepage:** https://www.gaofen-challenge.com/benchmark ## Description <!-- Provide a longer summary of what this dataset is. --> FAIR1M is a part of the ISPRS Benchmark on Object Detection in High-Resolution Satellite Images. Please note that, as of now, only a portion of the training dataset (1,732/15,000 images) has been released for the challenge. - **1 million object instances** - **Number of Samples**: 15000 - **Bands**: 3 (RGB) - **Image Size**: 1024x1024 - **Image Resolution**: 0.3–0.8m - **Land Cover Classes**: 37 - **Classes**: 5 object categories, 37 object sub-categories. - **Scene Categories**: Passenger Ship, Motorboat, Fishing Boat, Tugboat, other-ship, Engineering Ship, Liquid Cargo Ship, Dry Cargo Ship, Warship, Small Car, Bus, Cargo Truck, Dump Truck, other-vehicle, Van, Trailer, Tractor, Excavator, Truck Tractor, Boeing737, Boeing747, Boeing777, Boeing787, ARJ21, C919, A220, A321, A330, A350, other-airplane, Baseball Field, Basketball Court, Football Field, Tennis Court, Roundabout, Intersection, Bridge - **Source**: Gaofen/Google Earth ## Usage To use this dataset, simply use `datasets.load_dataset("blanchon/FAIR1M")`. <!-- Provide any additional information on how to use this dataset. --> ```python from datasets import load_dataset FAIR1M = load_dataset("blanchon/FAIR1M") ``` ## Citation <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> If you use the FAIR1M dataset in your research, please consider citing the following publication: ```bibtex @article{sun2021fair1m, title = {FAIR1M: A Benchmark Dataset for Fine-grained Object Recognition in High-Resolution Remote Sensing Imagery}, author = {Xian Sun and Peijin Wang and Zhiyuan Yan and F. Xu and Ruiping Wang and W. Diao and Jin Chen and Jihao Li and Yingchao Feng and Tao Xu and M. Weinmann and S. Hinz and Cheng Wang and K. Fu}, journal = {Isprs Journal of Photogrammetry and Remote Sensing}, year = {2021}, doi = {10.1016/j.isprsjprs.2021.12.004}, bibSource = {Semantic Scholar https://www.semanticscholar.org/paper/6d3c2dc63ff0deec10f60e5a515c93af4f8676f2} } ```
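The card shows how to load the dataset but does not document the split names or the annotation schema, so a reasonable first step is to inspect them; the snippet below is a minimal sketch of that and deliberately makes no assumptions about the feature names.

```python
from datasets import load_dataset

fair1m = load_dataset("blanchon/FAIR1M")
print(fair1m)                      # available splits and their sizes

split = next(iter(fair1m))         # first split, whatever it is named
sample = fair1m[split][0]
print(list(sample.keys()))         # image field plus rotated-bounding-box annotation fields
```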
blanchon/FAIR1M
[ "task_categories:object-detection", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "object-detection", "arxiv:2103.05569", "region:us" ]
2023-12-04T14:22:26+00:00
{"language": "en", "license": "unknown", "task_categories": ["object-detection"], "paperswithcode_id": "FAIR1M", "pretty_name": "FAIR1M", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "object-detection"]}
2023-12-06T12:38:09+00:00
[ "2103.05569" ]
[ "en" ]
TAGS #task_categories-object-detection #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #object-detection #arxiv-2103.05569 #region-us
# FAIR1M !FAIR1M The FAIR1M dataset is a fine-grained object recognition and detection dataset that focuses on high-resolution (0.3-0.8m) RGB images taken by the Gaogen (GF) satellites and extracted from Google Earth. It consists of a collection of 15,000 high-resolution images that cover various objects and scenes. The dataset provides annotations in the form of rotated bounding boxes for objects belonging to 5 main categories (ships, vehicles, airplanes, courts, and roads), further divided into 37 sub-categories. - Paper: URL - Homepage: URL ## Description FAIR1M is a part of the ISPRS Benchmark on Object Detection in High-Resolution Satellite Images. Please note that, as of now, only a portion of the training dataset (1,732/15,000 images) has been released for the challenge. - 1 million object instances - Number of Samples: 15000 - Bands: 3 (RGB) - Image Size: 1024x1024 - Image Resolution: 0.3–0.8m - Land Cover Classes: 37 - Classes: 5 object categories, 37 object sub-categories. - Scene Categories: Passenger Ship, Motorboat, Fishing Boat, Tugboat, other-ship, Engineering Ship, Liquid Cargo Ship, Dry Cargo Ship, Warship, Small Car, Bus, Cargo Truck, Dump Truck, other-vehicle, Van, Trailer, Tractor, Excavator, Truck Tractor, Boeing737, Boeing747, Boeing777, Boeing787, ARJ21, C919, A220, A321, A330, A350, other-airplane, Baseball Field, Basketball Court, Football Field, Tennis Court, Roundabout, Intersection, Bridge - Source: Gaofen/Google Earth ## Usage To use this dataset, simply use 'datasets.load_dataset("blanchon/FAIR1M")'. If you use the FAIR1M dataset in your research, please consider citing the following publication:
[ "# FAIR1M\n\n\n!FAIR1M\n\n\nThe FAIR1M dataset is a fine-grained object recognition and detection dataset that focuses on high-resolution (0.3-0.8m) RGB images taken by the Gaogen (GF) satellites and extracted from Google Earth. It consists of a collection of 15,000 high-resolution images that cover various objects and scenes. The dataset provides annotations in the form of rotated bounding boxes for objects belonging to 5 main categories (ships, vehicles, airplanes, courts, and roads), further divided into 37 sub-categories.\n- Paper: URL\n- Homepage: URL", "## Description\n\n\n\nFAIR1M is a part of the ISPRS Benchmark on Object Detection in High-Resolution Satellite Images. Please note that, as of now, only a portion of the training dataset (1,732/15,000 images) has been released for the challenge.\n\n- 1 million object instances\n- Number of Samples: 15000\n- Bands: 3 (RGB)\n- Image Size: 1024x1024\n- Image Resolution: 0.3–0.8m\n- Land Cover Classes: 37\n- Classes: 5 object categories, 37 object sub-categories.\n- Scene Categories: Passenger Ship, Motorboat, Fishing Boat, Tugboat, other-ship, Engineering Ship, Liquid Cargo Ship, Dry Cargo Ship, Warship, Small Car, Bus, Cargo Truck, Dump Truck, other-vehicle, Van, Trailer, Tractor, Excavator, Truck Tractor, Boeing737, Boeing747, Boeing777, Boeing787, ARJ21, C919, A220, A321, A330, A350, other-airplane, Baseball Field, Basketball Court, Football Field, Tennis Court, Roundabout, Intersection, Bridge\n- Source: Gaofen/Google Earth", "## Usage\n\nTo use this dataset, simply use 'datasets.load_dataset(\"blanchon/FAIR1M\")'.\n\n\n\nIf you use the FAIR1M dataset in your research, please consider citing the following publication:" ]
[ "TAGS\n#task_categories-object-detection #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #object-detection #arxiv-2103.05569 #region-us \n", "# FAIR1M\n\n\n!FAIR1M\n\n\nThe FAIR1M dataset is a fine-grained object recognition and detection dataset that focuses on high-resolution (0.3-0.8m) RGB images taken by the Gaogen (GF) satellites and extracted from Google Earth. It consists of a collection of 15,000 high-resolution images that cover various objects and scenes. The dataset provides annotations in the form of rotated bounding boxes for objects belonging to 5 main categories (ships, vehicles, airplanes, courts, and roads), further divided into 37 sub-categories.\n- Paper: URL\n- Homepage: URL", "## Description\n\n\n\nFAIR1M is a part of the ISPRS Benchmark on Object Detection in High-Resolution Satellite Images. Please note that, as of now, only a portion of the training dataset (1,732/15,000 images) has been released for the challenge.\n\n- 1 million object instances\n- Number of Samples: 15000\n- Bands: 3 (RGB)\n- Image Size: 1024x1024\n- Image Resolution: 0.3–0.8m\n- Land Cover Classes: 37\n- Classes: 5 object categories, 37 object sub-categories.\n- Scene Categories: Passenger Ship, Motorboat, Fishing Boat, Tugboat, other-ship, Engineering Ship, Liquid Cargo Ship, Dry Cargo Ship, Warship, Small Car, Bus, Cargo Truck, Dump Truck, other-vehicle, Van, Trailer, Tractor, Excavator, Truck Tractor, Boeing737, Boeing747, Boeing777, Boeing787, ARJ21, C919, A220, A321, A330, A350, other-airplane, Baseball Field, Basketball Court, Football Field, Tennis Court, Roundabout, Intersection, Bridge\n- Source: Gaofen/Google Earth", "## Usage\n\nTo use this dataset, simply use 'datasets.load_dataset(\"blanchon/FAIR1M\")'.\n\n\n\nIf you use the FAIR1M dataset in your research, please consider citing the following publication:" ]
[ 65, 148, 269, 54 ]
[ "passage: TAGS\n#task_categories-object-detection #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #object-detection #arxiv-2103.05569 #region-us \n# FAIR1M\n\n\n!FAIR1M\n\n\nThe FAIR1M dataset is a fine-grained object recognition and detection dataset that focuses on high-resolution (0.3-0.8m) RGB images taken by the Gaogen (GF) satellites and extracted from Google Earth. It consists of a collection of 15,000 high-resolution images that cover various objects and scenes. The dataset provides annotations in the form of rotated bounding boxes for objects belonging to 5 main categories (ships, vehicles, airplanes, courts, and roads), further divided into 37 sub-categories.\n- Paper: URL\n- Homepage: URL## Description\n\n\n\nFAIR1M is a part of the ISPRS Benchmark on Object Detection in High-Resolution Satellite Images. Please note that, as of now, only a portion of the training dataset (1,732/15,000 images) has been released for the challenge.\n\n- 1 million object instances\n- Number of Samples: 15000\n- Bands: 3 (RGB)\n- Image Size: 1024x1024\n- Image Resolution: 0.3–0.8m\n- Land Cover Classes: 37\n- Classes: 5 object categories, 37 object sub-categories.\n- Scene Categories: Passenger Ship, Motorboat, Fishing Boat, Tugboat, other-ship, Engineering Ship, Liquid Cargo Ship, Dry Cargo Ship, Warship, Small Car, Bus, Cargo Truck, Dump Truck, other-vehicle, Van, Trailer, Tractor, Excavator, Truck Tractor, Boeing737, Boeing747, Boeing777, Boeing787, ARJ21, C919, A220, A321, A330, A350, other-airplane, Baseball Field, Basketball Court, Football Field, Tennis Court, Roundabout, Intersection, Bridge\n- Source: Gaofen/Google Earth" ]
894a80979aadeabf16ce408d0db7f6bd0dddfb1c
# Dataset Card for "token-classification-brand" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dumyy/token-classification-brand
[ "region:us" ]
2023-12-04T14:35:05+00:00
{"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "0", "1": "B-BRAND", "2": "I-BRAND"}}}}, {"name": "spans", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 462, "num_examples": 3}, {"name": "val", "num_bytes": 462, "num_examples": 3}], "download_size": 4670, "dataset_size": 924}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}]}
2023-12-07T16:55:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "token-classification-brand" More Information needed
[ "# Dataset Card for \"token-classification-brand\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"token-classification-brand\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"token-classification-brand\"\n\nMore Information needed" ]
f8fb2c69b80203f0b9a622f9fdb7f6a6001a1676
# Remote Sensing Image Scene Classification (RESISC45) Dataset - **Paper** [Remote Sensing Image Scene Classification: Benchmark and State of the Art ](https://arxiv.org/pdf/1703.00121.pdf) - **Paper with code**: [RESISC45](https://paperswithcode.com/dataset/resisc45) ![RESISC45](./thumbnail.png) ## Description The RESISC45 dataset is a scene classification dataset that focuses on RGB images extracted using [Google Earth](https://earth.google.com/web/). This dataset comprises a total of 31,500 images, with each image having a resolution of 256x256 pixels. RESISC45 contains 45 different scene classes, with 700 images per class. These images are collected from over 100 countries and were specifically selected to optimize for high variability in image conditions, including spatial resolution, occlusion, weather, illumination, and more. Among its notable features, RESISC45 contains varying spatial resolution ranging from 20cm to more than 30m/px. ## Details ## Structure ```tree . ├── README.md └── data    ├── airplane    │   ├── airplane_1.jpg    │   ├── ...    │   └── airplane_700.jpg    ├── airport    ├── baseball_diamond    ├── beach    ├── ...    └── wetland ``` ### Statistics - Total Number of Images: 31,500 - Image Resolution: 256x256 pixels - Scene Categories: 45 - Dataset Size: Approximately 0.47GB ## Citation If you use the RESISC45 dataset in your research, please consider citing the following publication or the dataset's official website: ```bibtex @article{cheng2017remote, title = {Remote sensing image scene classification: Benchmark and state of the art}, author = {Cheng, Gong and Han, Junwei and Lu, Xiaoqiang}, journal = {Proceedings of the IEEE}, volume = {105}, number = {10}, pages = {1865-1883}, year = {2017}, publisher = {IEEE} } ```
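A quick way to index the class-per-folder layout shown in the Structure section is sketched below; the local root `data/` mirrors the tree above and is an assumption.

```python
from pathlib import Path
from collections import Counter

root = Path("data")
# Each image lives under data/<class_name>/<class_name>_<i>.jpg per the tree above.
index = [(path, path.parent.name) for path in sorted(root.glob("*/*.jpg"))]

counts = Counter(label for _, label in index)
print(f"{len(index)} images across {len(counts)} classes")  # expected: 31500 images across 45 classes
print(counts["airplane"])                                    # expected: 700 (each class has 700 images)
```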
blanchon/RESISC45
[ "task_categories:image-classification", "size_categories:n<1G", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-classification", "arxiv:1703.00121", "region:us" ]
2023-12-04T14:55:30+00:00
{"language": ["en"], "license": ["unknown"], "size_categories": ["n<1G"], "task_categories": ["image-classification"], "pretty_name": "RESISC45 Dataset", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-classification"]}
2023-12-04T16:24:49+00:00
[ "1703.00121" ]
[ "en" ]
TAGS #task_categories-image-classification #size_categories-n<1G #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #arxiv-1703.00121 #region-us
# Remote Sensing Image Scene Classification (RESISC45) Dataset - Paper Remote Sensing Image Scene Classification: Benchmark and State of the Art - Paper with code: RESISC45 !RESISC45 ## Description The RESISC45 dataset is a scene classification dataset that focuses on RGB images extracted using Google Earth. This dataset comprises a total of 31,500 images, with each image having a resolution of 256x256 pixels. RESISC45 contains 45 different scene classes, with 700 images per class. These images are collected from over 100 countries and were specifically selected to optimize for high variability in image conditions, including spatial resolution, occlusion, weather, illumination, and more. Among its notable features, RESISC45 contains varying spatial resolution ranging from 20cm to more than 30m/px. ## Details ## Structure ### Statistics - Total Number of Images: 31,500 - Image Resolution: 256x256 pixels - Scene Categories: 45 - Dataset Size: Approximately 0.47GB If you use the RESISC45 dataset in your research, please consider citing the following publication or the dataset's official website:
[ "# Remote Sensing Image Scene Classification (RESISC45) Dataset\n\n- Paper Remote Sensing Image Scene Classification: Benchmark and State of the Art\n\n- Paper with code: RESISC45\n\n!RESISC45", "## Description\n\nThe RESISC45 dataset is a scene classification dataset that focuses on RGB images extracted using Google Earth. This dataset comprises a total of 31,500 images, with each image having a resolution of 256x256 pixels. RESISC45 contains 45 different scene classes, with 700 images per class. These images are collected from over 100 countries and were specifically selected to optimize for high variability in image conditions, including spatial resolution, occlusion, weather, illumination, and more. Among its notable features, RESISC45 contains varying spatial resolution ranging from 20cm to more than 30m/px.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 31,500\n- Image Resolution: 256x256 pixels\n- Scene Categories: 45\n- Dataset Size: Approximately 0.47GB\n\nIf you use the RESISC45 dataset in your research, please consider citing the following publication or the dataset's official website:" ]
[ "TAGS\n#task_categories-image-classification #size_categories-n<1G #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #arxiv-1703.00121 #region-us \n", "# Remote Sensing Image Scene Classification (RESISC45) Dataset\n\n- Paper Remote Sensing Image Scene Classification: Benchmark and State of the Art\n\n- Paper with code: RESISC45\n\n!RESISC45", "## Description\n\nThe RESISC45 dataset is a scene classification dataset that focuses on RGB images extracted using Google Earth. This dataset comprises a total of 31,500 images, with each image having a resolution of 256x256 pixels. RESISC45 contains 45 different scene classes, with 700 images per class. These images are collected from over 100 countries and were specifically selected to optimize for high variability in image conditions, including spatial resolution, occlusion, weather, illumination, and more. Among its notable features, RESISC45 contains varying spatial resolution ranging from 20cm to more than 30m/px.", "## Details", "## Structure", "### Statistics\n- Total Number of Images: 31,500\n- Image Resolution: 256x256 pixels\n- Scene Categories: 45\n- Dataset Size: Approximately 0.47GB\n\nIf you use the RESISC45 dataset in your research, please consider citing the following publication or the dataset's official website:" ]
[ 76, 47, 143, 2, 4, 70 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-n<1G #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #arxiv-1703.00121 #region-us \n# Remote Sensing Image Scene Classification (RESISC45) Dataset\n\n- Paper Remote Sensing Image Scene Classification: Benchmark and State of the Art\n\n- Paper with code: RESISC45\n\n!RESISC45## Description\n\nThe RESISC45 dataset is a scene classification dataset that focuses on RGB images extracted using Google Earth. This dataset comprises a total of 31,500 images, with each image having a resolution of 256x256 pixels. RESISC45 contains 45 different scene classes, with 700 images per class. These images are collected from over 100 countries and were specifically selected to optimize for high variability in image conditions, including spatial resolution, occlusion, weather, illumination, and more. Among its notable features, RESISC45 contains varying spatial resolution ranging from 20cm to more than 30m/px.## Details## Structure### Statistics\n- Total Number of Images: 31,500\n- Image Resolution: 256x256 pixels\n- Scene Categories: 45\n- Dataset Size: Approximately 0.47GB\n\nIf you use the RESISC45 dataset in your research, please consider citing the following publication or the dataset's official website:" ]
b547ed4e1f84b2a0785d20668b282e24ce5fe389
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-40b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T15:00:33.518629](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b/blob/main/results_2023-12-04T15-00-33.518629.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.354049089015047, "acc_stderr": 0.033417526940887884, "acc_norm": 0.35741274788934346, "acc_norm_stderr": 0.03422297037663694, "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396728, "mc2": 0.3752465265593006, "mc2_stderr": 0.013533322814931005 }, "harness|arc:challenge|25": { "acc": 0.40102389078498296, "acc_stderr": 0.01432225579071987, "acc_norm": 0.4300341296928328, "acc_norm_stderr": 0.014467631559137993 }, "harness|hellaswag|10": { "acc": 0.535749850627365, "acc_stderr": 0.004977010670436551, "acc_norm": 0.7236606253734316, "acc_norm_stderr": 0.004462727543055892 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4, "acc_stderr": 0.04232073695151589, "acc_norm": 0.4, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2631578947368421, "acc_stderr": 0.03583496176361063, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.03583496176361063 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3471698113207547, "acc_stderr": 0.029300101705549655, "acc_norm": 0.3471698113207547, "acc_norm_stderr": 0.029300101705549655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3541666666666667, "acc_stderr": 0.039994111357535424, "acc_norm": 0.3541666666666667, "acc_norm_stderr": 0.039994111357535424 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.0498887651569859, "acc_norm": 0.44, "acc_norm_stderr": 0.0498887651569859 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.28901734104046245, "acc_stderr": 0.034564257450869995, "acc_norm": 0.28901734104046245, "acc_norm_stderr": 0.034564257450869995 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.1568627450980392, "acc_stderr": 0.03618664819936245, "acc_norm": 0.1568627450980392, "acc_norm_stderr": 0.03618664819936245 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.31063829787234043, "acc_stderr": 0.03025123757921317, "acc_norm": 0.31063829787234043, "acc_norm_stderr": 0.03025123757921317 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21929824561403508, "acc_stderr": 0.038924311065187546, "acc_norm": 0.21929824561403508, "acc_norm_stderr": 0.038924311065187546 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3724137931034483, "acc_stderr": 0.04028731532947559, "acc_norm": 0.3724137931034483, "acc_norm_stderr": 0.04028731532947559 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2619047619047619, "acc_stderr": 0.022644212615525214, "acc_norm": 0.2619047619047619, "acc_norm_stderr": 0.022644212615525214 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574924, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574924 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.36129032258064514, "acc_stderr": 0.02732754844795754, "acc_norm": 0.36129032258064514, "acc_norm_stderr": 0.02732754844795754 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2561576354679803, "acc_stderr": 0.0307127300709826, "acc_norm": 0.2561576354679803, "acc_norm_stderr": 0.0307127300709826 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.503030303030303, "acc_stderr": 0.03904272341431857, "acc_norm": 0.503030303030303, "acc_norm_stderr": 0.03904272341431857 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3434343434343434, "acc_stderr": 0.03383201223244442, "acc_norm": 0.3434343434343434, "acc_norm_stderr": 0.03383201223244442 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.41450777202072536, "acc_stderr": 0.035553003195576735, "acc_norm": 0.41450777202072536, "acc_norm_stderr": 0.035553003195576735 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.28717948717948716, "acc_stderr": 0.02293992541853062, "acc_norm": 0.28717948717948716, "acc_norm_stderr": 0.02293992541853062 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2518518518518518, "acc_stderr": 0.02646611753895992, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.02646611753895992 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3067226890756303, "acc_stderr": 0.02995382389188704, "acc_norm": 
0.3067226890756303, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 0.036313298039696525, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.42752293577981654, "acc_stderr": 0.02121091020430043, "acc_norm": 0.42752293577981654, "acc_norm_stderr": 0.02121091020430043 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.17592592592592593, "acc_stderr": 0.02596742095825853, "acc_norm": 0.17592592592592593, "acc_norm_stderr": 0.02596742095825853 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.47549019607843135, "acc_stderr": 0.03505093194348798, "acc_norm": 0.47549019607843135, "acc_norm_stderr": 0.03505093194348798 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.4767932489451477, "acc_stderr": 0.032512152011410174, "acc_norm": 0.4767932489451477, "acc_norm_stderr": 0.032512152011410174 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.32286995515695066, "acc_stderr": 0.031381476375754995, "acc_norm": 0.32286995515695066, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.45038167938931295, "acc_stderr": 0.04363643698524779, "acc_norm": 0.45038167938931295, "acc_norm_stderr": 0.04363643698524779 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.0453793517794788, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.0453793517794788 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4074074074074074, "acc_stderr": 0.04750077341199986, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.04750077341199986 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.294478527607362, "acc_stderr": 0.03581165790474082, "acc_norm": 0.294478527607362, "acc_norm_stderr": 0.03581165790474082 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755803, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755803 }, "harness|hendrycksTest-management|5": { "acc": 0.4563106796116505, "acc_stderr": 0.049318019942204146, "acc_norm": 0.4563106796116505, "acc_norm_stderr": 0.049318019942204146 }, "harness|hendrycksTest-marketing|5": { "acc": 0.4188034188034188, "acc_stderr": 0.03232128912157792, "acc_norm": 0.4188034188034188, "acc_norm_stderr": 0.03232128912157792 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4329501915708812, "acc_stderr": 0.017718469101513982, "acc_norm": 0.4329501915708812, "acc_norm_stderr": 0.017718469101513982 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.3265895953757225, "acc_stderr": 0.025248264774242826, "acc_norm": 0.3265895953757225, "acc_norm_stderr": 0.025248264774242826 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249588, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249588 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3464052287581699, "acc_stderr": 0.02724561304721536, "acc_norm": 0.3464052287581699, "acc_norm_stderr": 0.02724561304721536 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3504823151125402, "acc_stderr": 0.027098652621301754, "acc_norm": 0.3504823151125402, "acc_norm_stderr": 0.027098652621301754 }, "harness|hendrycksTest-prehistory|5": { 
"acc": 0.37037037037037035, "acc_stderr": 0.026869490744815257, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.026869490744815257 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.25886524822695034, "acc_stderr": 0.026129572527180848, "acc_norm": 0.25886524822695034, "acc_norm_stderr": 0.026129572527180848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.33702737940026073, "acc_stderr": 0.012072836273691327, "acc_norm": 0.33702737940026073, "acc_norm_stderr": 0.012072836273691327 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2536764705882353, "acc_stderr": 0.02643132987078954, "acc_norm": 0.2536764705882353, "acc_norm_stderr": 0.02643132987078954 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.34967320261437906, "acc_stderr": 0.01929196189506638, "acc_norm": 0.34967320261437906, "acc_norm_stderr": 0.01929196189506638 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.43636363636363634, "acc_stderr": 0.04750185058907297, "acc_norm": 0.43636363636363634, "acc_norm_stderr": 0.04750185058907297 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.47346938775510206, "acc_stderr": 0.03196412734523272, "acc_norm": 0.47346938775510206, "acc_norm_stderr": 0.03196412734523272 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4427860696517413, "acc_stderr": 0.03512310964123937, "acc_norm": 0.4427860696517413, "acc_norm_stderr": 0.03512310964123937 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-virology|5": { "acc": 0.3493975903614458, "acc_stderr": 0.0371172519074075, "acc_norm": 0.3493975903614458, "acc_norm_stderr": 0.0371172519074075 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.47953216374269003, "acc_stderr": 0.038316105328219316, "acc_norm": 0.47953216374269003, "acc_norm_stderr": 0.038316105328219316 }, "harness|truthfulqa:mc|0": { "mc1": 0.22276621787025705, "mc1_stderr": 0.014566506961396728, "mc2": 0.3752465265593006, "mc2_stderr": 0.013533322814931005 }, "harness|winogrande|5": { "acc": 0.6795580110497238, "acc_stderr": 0.013115085457681709 }, "harness|gsm8k|5": { "acc": 0.04700530705079606, "acc_stderr": 0.005829898355937188 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b
[ "region:us" ]
2023-12-04T15:02:48+00:00
{"pretty_name": "Evaluation run of AI-Sweden-Models/gpt-sw3-40b", "dataset_summary": "Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-40b](https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:00:33.518629](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-40b/blob/main/results_2023-12-04T15-00-33.518629.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.354049089015047,\n \"acc_stderr\": 0.033417526940887884,\n \"acc_norm\": 0.35741274788934346,\n \"acc_norm_stderr\": 0.03422297037663694,\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396728,\n \"mc2\": 0.3752465265593006,\n \"mc2_stderr\": 0.013533322814931005\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.40102389078498296,\n \"acc_stderr\": 0.01432225579071987,\n \"acc_norm\": 0.4300341296928328,\n \"acc_norm_stderr\": 0.014467631559137993\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.535749850627365,\n \"acc_stderr\": 0.004977010670436551,\n \"acc_norm\": 0.7236606253734316,\n \"acc_norm_stderr\": 0.004462727543055892\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361063,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361063\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3471698113207547,\n \"acc_stderr\": 0.029300101705549655,\n \"acc_norm\": 0.3471698113207547,\n \"acc_norm_stderr\": 0.029300101705549655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n 
\"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.03618664819936245,\n \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.03618664819936245\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.038924311065187546,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.038924311065187546\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947559,\n \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947559\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431857,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431857\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3434343434343434,\n \"acc_stderr\": 0.03383201223244442,\n \"acc_norm\": 0.3434343434343434,\n \"acc_norm_stderr\": 0.03383201223244442\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.035553003195576735,\n \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.035553003195576735\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.28717948717948716,\n 
\"acc_stderr\": 0.02293992541853062,\n \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.02293992541853062\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.42752293577981654,\n \"acc_stderr\": 0.02121091020430043,\n \"acc_norm\": 0.42752293577981654,\n \"acc_norm_stderr\": 0.02121091020430043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.03505093194348798,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.03505093194348798\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4767932489451477,\n \"acc_stderr\": 0.032512152011410174,\n \"acc_norm\": 0.4767932489451477,\n \"acc_norm_stderr\": 0.032512152011410174\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755803,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755803\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4188034188034188,\n \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.4188034188034188,\n \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4329501915708812,\n \"acc_stderr\": 0.017718469101513982,\n \"acc_norm\": 
0.4329501915708812,\n \"acc_norm_stderr\": 0.017718469101513982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3265895953757225,\n \"acc_stderr\": 0.025248264774242826,\n \"acc_norm\": 0.3265895953757225,\n \"acc_norm_stderr\": 0.025248264774242826\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3464052287581699,\n \"acc_stderr\": 0.02724561304721536,\n \"acc_norm\": 0.3464052287581699,\n \"acc_norm_stderr\": 0.02724561304721536\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3504823151125402,\n \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.3504823151125402,\n \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.026869490744815257,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.026869490744815257\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33702737940026073,\n \"acc_stderr\": 0.012072836273691327,\n \"acc_norm\": 0.33702737940026073,\n \"acc_norm_stderr\": 0.012072836273691327\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.02643132987078954,\n \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.02643132987078954\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.34967320261437906,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4427860696517413,\n \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.4427860696517413,\n \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396728,\n \"mc2\": 0.3752465265593006,\n \"mc2_stderr\": 0.013533322814931005\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681709\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \"acc_stderr\": 0.005829898355937188\n }\n}\n```", "repo_url": 
"https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-00-33.518629.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["**/details_harness|winogrande|5_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-00-33.518629.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T15_00_33.518629", "path": ["results_2023-12-04T15-00-33.518629.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T15-00-33.518629.parquet"]}]}]}
2023-12-04T15:03:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-40b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T15:00:33.518629 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-40b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:00:33.518629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-40b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:00:33.518629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-40b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AI-Sweden-Models/gpt-sw3-40b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:00:33.518629(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
4df51d8fe677a8306a32227ed29218898b693ede
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-1.3b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T15:02:34.832979](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct/blob/main/results_2023-12-04T15-02-34.832979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.28454514557458804, "acc_stderr": 0.031975322223722284, "acc_norm": 0.2866992057773824, "acc_norm_stderr": 0.03278367939837994, "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557989, "mc2": 0.44015243507625756, "mc2_stderr": 0.015219908561861553 }, "harness|arc:challenge|25": { "acc": 0.257679180887372, "acc_stderr": 0.0127807705627684, "acc_norm": 0.2858361774744027, "acc_norm_stderr": 0.013203196088537369 }, "harness|hellaswag|10": { "acc": 0.33419637522405893, "acc_stderr": 0.004707447244200623, "acc_norm": 0.398725353515236, "acc_norm_stderr": 0.0048863535635718545 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2962962962962963, "acc_stderr": 0.03944624162501117, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03944624162501117 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.19736842105263158, "acc_stderr": 0.03238981601699397, "acc_norm": 0.19736842105263158, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3018867924528302, "acc_stderr": 0.02825420034443865, "acc_norm": 0.3018867924528302, "acc_norm_stderr": 0.02825420034443865 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720683, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.23, "acc_stderr": 0.042295258468165044, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165044 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.24277456647398843, "acc_stderr": 0.0326926380614177, "acc_norm": 0.24277456647398843, "acc_norm_stderr": 0.0326926380614177 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.043364327079931785, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.043364327079931785 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.31063829787234043, "acc_stderr": 0.03025123757921317, "acc_norm": 0.31063829787234043, "acc_norm_stderr": 0.03025123757921317 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.040493392977481425, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.040493392977481425 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.38620689655172413, "acc_stderr": 0.04057324734419035, "acc_norm": 0.38620689655172413, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.0225698970749184, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.0225698970749184 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.21428571428571427, "acc_stderr": 
0.03670066451047181, "acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.03670066451047181 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2806451612903226, "acc_stderr": 0.025560604721022884, "acc_norm": 0.2806451612903226, "acc_norm_stderr": 0.025560604721022884 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.27586206896551724, "acc_stderr": 0.031447125816782426, "acc_norm": 0.27586206896551724, "acc_norm_stderr": 0.031447125816782426 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2909090909090909, "acc_stderr": 0.03546563019624336, "acc_norm": 0.2909090909090909, "acc_norm_stderr": 0.03546563019624336 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.35858585858585856, "acc_stderr": 0.03416903640391521, "acc_norm": 0.35858585858585856, "acc_norm_stderr": 0.03416903640391521 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.32642487046632124, "acc_stderr": 0.033840286211432945, "acc_norm": 0.32642487046632124, "acc_norm_stderr": 0.033840286211432945 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2692307692307692, "acc_stderr": 0.022489389793654824, "acc_norm": 0.2692307692307692, "acc_norm_stderr": 0.022489389793654824 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24444444444444444, "acc_stderr": 0.026202766534652148, "acc_norm": 0.24444444444444444, "acc_norm_stderr": 0.026202766534652148 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.2605042016806723, "acc_stderr": 0.028510251512341933, "acc_norm": 0.2605042016806723, "acc_norm_stderr": 0.028510251512341933 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3174311926605505, "acc_stderr": 0.0199571521984605, "acc_norm": 0.3174311926605505, "acc_norm_stderr": 0.0199571521984605 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.25462962962962965, "acc_stderr": 0.02971127586000536, "acc_norm": 0.25462962962962965, "acc_norm_stderr": 0.02971127586000536 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.2549019607843137, "acc_stderr": 0.030587591351604257, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.030587591351604257 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3080168776371308, "acc_stderr": 0.0300523893356057, "acc_norm": 0.3080168776371308, "acc_norm_stderr": 0.0300523893356057 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.14798206278026907, "acc_stderr": 0.023831557157613537, "acc_norm": 0.14798206278026907, "acc_norm_stderr": 0.023831557157613537 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3511450381679389, "acc_stderr": 0.04186445163013751, "acc_norm": 0.3511450381679389, "acc_norm_stderr": 0.04186445163013751 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2892561983471074, "acc_stderr": 0.04139112727635464, "acc_norm": 0.2892561983471074, "acc_norm_stderr": 0.04139112727635464 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, 
"acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.035590395316173425, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.04432804055291519, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.04432804055291519 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258973, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258973 }, "harness|hendrycksTest-marketing|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03088273697413866, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03088273697413866 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.29246487867177523, "acc_stderr": 0.01626700068459864, "acc_norm": 0.29246487867177523, "acc_norm_stderr": 0.01626700068459864 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.28034682080924855, "acc_stderr": 0.024182427496577605, "acc_norm": 0.28034682080924855, "acc_norm_stderr": 0.024182427496577605 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.014893391735249588, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.014893391735249588 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.30392156862745096, "acc_stderr": 0.026336613469046637, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.026336613469046637 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.29260450160771706, "acc_stderr": 0.025839898334877983, "acc_norm": 0.29260450160771706, "acc_norm_stderr": 0.025839898334877983 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.28703703703703703, "acc_stderr": 0.02517104191530968, "acc_norm": 0.28703703703703703, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24468085106382978, "acc_stderr": 0.025645553622266722, "acc_norm": 0.24468085106382978, "acc_norm_stderr": 0.025645553622266722 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2900912646675359, "acc_stderr": 0.011590375554733095, "acc_norm": 0.2900912646675359, "acc_norm_stderr": 0.011590375554733095 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41544117647058826, "acc_stderr": 0.029935342707877746, "acc_norm": 0.41544117647058826, "acc_norm_stderr": 0.029935342707877746 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.24183006535947713, "acc_stderr": 0.017322789207784326, "acc_norm": 0.24183006535947713, "acc_norm_stderr": 0.017322789207784326 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.33636363636363636, "acc_stderr": 0.04525393596302506, "acc_norm": 0.33636363636363636, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2530612244897959, "acc_stderr": 0.027833023871399694, "acc_norm": 0.2530612244897959, "acc_norm_stderr": 0.027833023871399694 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3383084577114428, "acc_stderr": 0.03345563070339193, "acc_norm": 0.3383084577114428, "acc_norm_stderr": 0.03345563070339193 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-virology|5": { "acc": 0.25301204819277107, "acc_stderr": 
0.03384429155233137, "acc_norm": 0.25301204819277107, "acc_norm_stderr": 0.03384429155233137 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.21637426900584794, "acc_stderr": 0.03158149539338734, "acc_norm": 0.21637426900584794, "acc_norm_stderr": 0.03158149539338734 }, "harness|truthfulqa:mc|0": { "mc1": 0.2594859241126071, "mc1_stderr": 0.015345409485557989, "mc2": 0.44015243507625756, "mc2_stderr": 0.015219908561861553 }, "harness|winogrande|5": { "acc": 0.5240726124704025, "acc_stderr": 0.01403618966539513 }, "harness|gsm8k|5": { "acc": 0.01061410159211524, "acc_stderr": 0.002822713322387704 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct
[ "region:us" ]
2023-12-04T15:05:24+00:00
{"pretty_name": "Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-1.3b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:02:34.832979](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct/blob/main/results_2023-12-04T15-02-34.832979.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28454514557458804,\n \"acc_stderr\": 0.031975322223722284,\n \"acc_norm\": 0.2866992057773824,\n \"acc_norm_stderr\": 0.03278367939837994,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.44015243507625756,\n \"mc2_stderr\": 0.015219908561861553\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.257679180887372,\n \"acc_stderr\": 0.0127807705627684,\n \"acc_norm\": 0.2858361774744027,\n \"acc_norm_stderr\": 0.013203196088537369\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33419637522405893,\n \"acc_stderr\": 0.004707447244200623,\n \"acc_norm\": 0.398725353515236,\n \"acc_norm_stderr\": 0.0048863535635718545\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501117,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501117\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.02825420034443865,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.02825420034443865\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.0225698970749184,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.0225698970749184\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.2806451612903226,\n \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.32642487046632124,\n \"acc_stderr\": 0.033840286211432945,\n \"acc_norm\": 0.32642487046632124,\n \"acc_norm_stderr\": 0.033840286211432945\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3174311926605505,\n \"acc_stderr\": 0.0199571521984605,\n \"acc_norm\": 0.3174311926605505,\n \"acc_norm_stderr\": 0.0199571521984605\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25462962962962965,\n \"acc_stderr\": 0.02971127586000536,\n \"acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.02971127586000536\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3080168776371308,\n \"acc_stderr\": 0.0300523893356057,\n \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.0300523893356057\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14798206278026907,\n \"acc_stderr\": 0.023831557157613537,\n \"acc_norm\": 0.14798206278026907,\n \"acc_norm_stderr\": 0.023831557157613537\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635464,\n \"acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635464\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258973,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n 
\"acc_stderr\": 0.01626700068459864,\n \"acc_norm\": 0.29246487867177523,\n \"acc_norm_stderr\": 0.01626700068459864\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.026336613469046637,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.026336613469046637\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266722,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266722\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2900912646675359,\n \"acc_stderr\": 0.011590375554733095,\n \"acc_norm\": 0.2900912646675359,\n \"acc_norm_stderr\": 0.011590375554733095\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399694,\n \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399694\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3383084577114428,\n \"acc_stderr\": 0.03345563070339193,\n \"acc_norm\": 0.3383084577114428,\n \"acc_norm_stderr\": 0.03345563070339193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.03384429155233137,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.03384429155233137\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.44015243507625756,\n \"mc2_stderr\": 0.015219908561861553\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.01403618966539513\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 
0.002822713322387704\n }\n}\n```", "repo_url": "https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_02_34.832979", "path": ["**/details_harness|winogrande|5_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-02-34.832979.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T15_02_34.832979", "path": ["results_2023-12-04T15-02-34.832979.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T15-02-34.832979.parquet"]}]}]}
2023-12-04T15:06:11+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-1.3b-instruct on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-12-04T15:02:34.832979 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
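For reference, the loading snippet that the summary above refers to (it is stored verbatim in this record's metadata) looks like the following; `harness_winogrande_5` is just one of the 63 task configurations, and any other config name from the list in the metadata can be substituted:

```python
from datasets import load_dataset

# Load the per-sample details of one evaluated task from this run.
# "harness_winogrande_5" is one of the 63 task configurations; the
# "train" split points to the latest results (a "latest" split is
# also defined for each configuration).
data = load_dataset(
    "open-llm-leaderboard/details_deepseek-ai__deepseek-coder-1.3b-instruct",
    "harness_winogrande_5",
    split="train",
)
```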
[ "# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-1.3b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:02:34.832979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-1.3b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:02:34.832979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 176, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-1.3b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-1.3b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:02:34.832979(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
a89037e45136ae407e6f670c026eb840b182b12e
# Dataset Card for Control-CelebA-HQ

## Overview

**Dataset Name**: Control-CelebA-HQ

**Description**: An enhanced version of the CelebA-HQ dataset, Control-CelebA-HQ is specifically designed for evaluating the controlling ability of controllable generative models. The dataset is featured in the NeurIPS 2023 work titled "Controlling Text-to-Image Diffusion by Orthogonal Finetuning (OFT)" and is pivotal for evaluating the control ability of controllable generative models.

**Dataset Type**: Generative Model, Controllable Generation, PEFT

**Official Page**: https://oft.wyliu.com/

## Dataset Structure

**Data Format**: Images with paired facial landmarks

**Size**: Training set - 29.5k images; Testing set - 500 images

**Resolution**: High Quality (CelebA-HQ standard)

**Attributes**: Facial features with color-coded facial landmarks for controllable generation

## Data Collection and Preparation

**Source**: Derived from the CelebA-HQ dataset

**Collection Method**: Original CelebA-HQ images processed with a standard face alignment tracker (available at https://github.com/1adrianb/face-alignment) for facial landmark detection

**Data Split**: 29.5k images for training, 500 images for testing

## Dataset Use and Access

**Recommended Uses**: Training and testing controllable generative models, particularly in the context of facial image generation with landmark-based control

**User Guidelines**: To use the dataset, train models on the training set with the facial landmarks as the control signal. For testing, generate images conditioned on the test landmarks and evaluate the control consistency error between the input landmarks and the landmarks detected on the generated images. Please cite the OFT paper when using this dataset and protocol.

**Note:** An example usage and evaluation script will come out soon in the Hugging Face PEFT and Diffusers examples. Stay tuned :D

**Citation**:
```
@InProceedings{Qiu2023OFT,
  title={Controlling Text-to-Image Diffusion by Orthogonal Finetuning},
  author={Qiu, Zeju and Liu, Weiyang and Feng, Haiwen and Xue, Yuxuan and Feng, Yao and Liu, Zhen and Zhang, Dan and Weller, Adrian and Schölkopf, Bernhard},
  booktitle={NeurIPS},
  year={2023}
}
```
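As a minimal sketch of the protocol in the user guidelines above, the snippet below loads the test split (the `image`, `conditioning_image`, and `text` features and the train/test splits are listed in this record's dataset_info) and computes one plausible form of the control consistency error as a mean L2 distance between landmark arrays; the landmark detector and the exact metric used in the official OFT evaluation are assumptions here, not taken from the card.

```python
from datasets import load_dataset
import numpy as np

# Load the 500-image test split; each example has an "image", a
# "conditioning_image" (the color-coded landmark map used as control),
# and a "text" prompt.
test_set = load_dataset("oftverse/control-celeba-hq", split="test")
example = test_set[0]
print(example["text"], example["image"].size, example["conditioning_image"].size)


def landmark_consistency_error(input_landmarks: np.ndarray,
                               generated_landmarks: np.ndarray) -> float:
    """Mean L2 distance between the input landmarks and the landmarks
    detected on the generated image (both assumed to be arrays of shape
    (num_points, 2)). This is one plausible form of the control
    consistency error; the official evaluation script may differ."""
    return float(np.linalg.norm(input_landmarks - generated_landmarks, axis=-1).mean())
```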
oftverse/control-celeba-hq
[ "license:mit", "region:us" ]
2023-12-04T15:15:03+00:00
{"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "conditioning_image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1087163166.621, "num_examples": 29487}, {"name": "test", "num_bytes": 18131154.0, "num_examples": 500}], "download_size": 1089858259, "dataset_size": 1105294320.621}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2023-12-08T14:13:39+00:00
[]
[]
TAGS #license-mit #region-us
# Dataset Card for Control-CelebA-HQ ## Overview Dataset Name: Control-CelebA-HQ Description: An enhanced version of the CelebA-HQ dataset, Control-CelebA-HQ is specifically designed for evaluating the controlling ability of controllable generative models. This dataset is featured in the NeurIPS 2023 work titled "Controlling Text-to-Image Diffusion by Orthogonal Finetuning (OFT)", and is pivotal in evaluating the control ability of the controllable generative models. Dataset Type: Generative Model, Controllable Generation, PEFT Official Page: URL ## Dataset Structure Data Format: Images with paired facial landmarks Size: Training set - 29.5k images; Testing set - 500 images Resolution: High Quality (CelebA-HQ standard) Attributes: Facial features with color-coded facial landmarks for controllable generation ## Data Collection and Preparation Source: Derived from the CelebA-HQ dataset Collection Method: Original CelebA-HQ images processed with a standard face alignment tracker (available at URL for facial landmark detection Data Split: 29.5k images for training, 500 images for testing ## Dataset Use and Access Recommended Uses: Training and testing controllable generative models, particularly in the context of facial image generation with landmark-based control User Guidelines: To use the dataset, train models on the training set using facial landmarks as control signals. For testing, generate images with landmarks as control and evaluate control consistency error between input and generated image's landmarks. Please cite the OFT paper when using this dataset and protocol. Note: Example usage and evaluation script will come out soon in Huggingface PEFT and Diffusers example. Stay tuned:D Citation:
[ "# Dataset Card for Control-CelebA-HQ", "## Overview\nDataset Name: Control-CelebA-HQ \nDescription: An enhanced version of the CelebA-HQ dataset, Control-CelebA-HQ is specifically designed for evaluating the controlling ability of controllable generative models. This dataset is featured in the NeurIPS 2023 work titled \"Controlling Text-to-Image Diffusion by Orthogonal Finetuning (OFT)\", and is pivotal in evaluating the control ability of the controllable generative models. \nDataset Type: Generative Model, Controllable Generation, PEFT \nOfficial Page: URL", "## Dataset Structure\nData Format: Images with paired facial landmarks \nSize: Training set - 29.5k images; Testing set - 500 images \nResolution: High Quality (CelebA-HQ standard) \nAttributes: Facial features with color-coded facial landmarks for controllable generation", "## Data Collection and Preparation\nSource: Derived from the CelebA-HQ dataset \nCollection Method: Original CelebA-HQ images processed with a standard face alignment tracker (available at URL for facial landmark detection \nData Split: 29.5k images for training, 500 images for testing", "## Dataset Use and Access\nRecommended Uses: Training and testing controllable generative models, particularly in the context of facial image generation with landmark-based control \nUser Guidelines: To use the dataset, train models on the training set using facial landmarks as control signals. For testing, generate images with landmarks as control and evaluate control consistency error between input and generated image's landmarks. Please cite the OFT paper when using this dataset and protocol. \n\nNote: Example usage and evaluation script will come out soon in Huggingface PEFT and Diffusers example. Stay tuned:D\n\nCitation:" ]
[ "TAGS\n#license-mit #region-us \n", "# Dataset Card for Control-CelebA-HQ", "## Overview\nDataset Name: Control-CelebA-HQ \nDescription: An enhanced version of the CelebA-HQ dataset, Control-CelebA-HQ is specifically designed for evaluating the controlling ability of controllable generative models. This dataset is featured in the NeurIPS 2023 work titled \"Controlling Text-to-Image Diffusion by Orthogonal Finetuning (OFT)\", and is pivotal in evaluating the control ability of the controllable generative models. \nDataset Type: Generative Model, Controllable Generation, PEFT \nOfficial Page: URL", "## Dataset Structure\nData Format: Images with paired facial landmarks \nSize: Training set - 29.5k images; Testing set - 500 images \nResolution: High Quality (CelebA-HQ standard) \nAttributes: Facial features with color-coded facial landmarks for controllable generation", "## Data Collection and Preparation\nSource: Derived from the CelebA-HQ dataset \nCollection Method: Original CelebA-HQ images processed with a standard face alignment tracker (available at URL for facial landmark detection \nData Split: 29.5k images for training, 500 images for testing", "## Dataset Use and Access\nRecommended Uses: Training and testing controllable generative models, particularly in the context of facial image generation with landmark-based control \nUser Guidelines: To use the dataset, train models on the training set using facial landmarks as control signals. For testing, generate images with landmarks as control and evaluate control consistency error between input and generated image's landmarks. Please cite the OFT paper when using this dataset and protocol. \n\nNote: Example usage and evaluation script will come out soon in Huggingface PEFT and Diffusers example. Stay tuned:D\n\nCitation:" ]
[ 11, 12, 131, 63, 67, 134 ]
[ "passage: TAGS\n#license-mit #region-us \n# Dataset Card for Control-CelebA-HQ## Overview\nDataset Name: Control-CelebA-HQ \nDescription: An enhanced version of the CelebA-HQ dataset, Control-CelebA-HQ is specifically designed for evaluating the controlling ability of controllable generative models. This dataset is featured in the NeurIPS 2023 work titled \"Controlling Text-to-Image Diffusion by Orthogonal Finetuning (OFT)\", and is pivotal in evaluating the control ability of the controllable generative models. \nDataset Type: Generative Model, Controllable Generation, PEFT \nOfficial Page: URL## Dataset Structure\nData Format: Images with paired facial landmarks \nSize: Training set - 29.5k images; Testing set - 500 images \nResolution: High Quality (CelebA-HQ standard) \nAttributes: Facial features with color-coded facial landmarks for controllable generation## Data Collection and Preparation\nSource: Derived from the CelebA-HQ dataset \nCollection Method: Original CelebA-HQ images processed with a standard face alignment tracker (available at URL for facial landmark detection \nData Split: 29.5k images for training, 500 images for testing## Dataset Use and Access\nRecommended Uses: Training and testing controllable generative models, particularly in the context of facial image generation with landmark-based control \nUser Guidelines: To use the dataset, train models on the training set using facial landmarks as control signals. For testing, generate images with landmarks as control and evaluate control consistency error between input and generated image's landmarks. Please cite the OFT paper when using this dataset and protocol. \n\nNote: Example usage and evaluation script will come out soon in Huggingface PEFT and Diffusers example. Stay tuned:D\n\nCitation:" ]
b2945b3547480de95ee6440c8894c20c598d8f97
# RSSCN7 ![RSSCN7](./thumbnail.png) ## Description The RSSCN7 dataset is designed for scene classification tasks and provides a collection of high-resolution RGB images. This dataset comprises a total of 2,800 images, each with a resolution of 400x400 pixels. The images are extracted from [Google Earth](https://earth.google.com/web/) and cover a range of diverse scenes. RSSCN7 includes seven distinct scene classes, with 400 images per class. ## Details ### Statistics - Total Number of Images: 2,800 - Image Resolution: 400x400 pixels - Scene Classes: 7 - Dataset Size: 0.36GB ## Citation If you use the RSSCN7 dataset in your research, please consider citing the following publication or the dataset's official website: ```bibtex @article{7272047, title = {Deep Learning Based Feature Selection for Remote Sensing Scene Classification}, author = {Zou, Qin and Ni, Lihao and Zhang, Tong and Wang, Qian}, year = 2015, journal = {IEEE Geoscience and Remote Sensing Letters}, volume = 12, number = 11, pages = {2321--2325}, doi = {10.1109/LGRS.2015.2475299} } ``` - Paper with code: https://paperswithcode.com/dataset/rsscn7 - Repo: https://github.com/palewithout/RSSCN7
blanchon/RSSCN7
[ "task_categories:image-classification", "size_categories:1K<n<10K", "language:en", "license:unknown", "remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-classification", "region:us" ]
2023-12-04T15:18:14+00:00
{"language": ["en"], "license": ["unknown"], "size_categories": ["1K<n<10K"], "task_categories": ["image-classification"], "pretty_name": "RSSCN7", "tags": ["remote-sensing", "earth-observation", "geospatial", "satellite-imagery", "scene-classification"]}
2023-12-04T15:20:40+00:00
[]
[ "en" ]
TAGS #task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #region-us
# RSSCN7 !RSSCN7 ## Description The RSSCN7 dataset is designed for scene classification tasks and provides a collection of high-resolution RGB images. This dataset comprises a total of 2,800 images, each with a resolution of 400x400 pixels. The images are extracted from Google Earth and cover a range of diverse scenes. RSSCN7 includes seven distinct scene classes, with 400 images per class. ## Details ### Statistics - Total Number of Images: 2,800 - Image Resolution: 400x400 pixels - Scene Classes: 7 - Dataset Size: 0.36GB If you use the RSSCN7 dataset in your research, please consider citing the following publication or the dataset's official website: - Paper with code: URL - Repo: URL
[ "# RSSCN7\n\n!RSSCN7", "## Description\n\nThe RSSCN7 dataset is designed for scene classification tasks and provides a collection of high-resolution RGB images. This dataset comprises a total of 2,800 images, each with a resolution of 400x400 pixels. The images are extracted from Google Earth and cover a range of diverse scenes. RSSCN7 includes seven distinct scene classes, with 400 images per class.", "## Details", "### Statistics\n- Total Number of Images: 2,800\n- Image Resolution: 400x400 pixels\n- Scene Classes: 7\n- Dataset Size: 0.36GB\n\nIf you use the RSSCN7 dataset in your research, please consider citing the following publication or the dataset's official website:\n\n\n\n- Paper with code: URL\n- Repo: URL" ]
[ "TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #region-us \n", "# RSSCN7\n\n!RSSCN7", "## Description\n\nThe RSSCN7 dataset is designed for scene classification tasks and provides a collection of high-resolution RGB images. This dataset comprises a total of 2,800 images, each with a resolution of 400x400 pixels. The images are extracted from Google Earth and cover a range of diverse scenes. RSSCN7 includes seven distinct scene classes, with 400 images per class.", "## Details", "### Statistics\n- Total Number of Images: 2,800\n- Image Resolution: 400x400 pixels\n- Scene Classes: 7\n- Dataset Size: 0.36GB\n\nIf you use the RSSCN7 dataset in your research, please consider citing the following publication or the dataset's official website:\n\n\n\n- Paper with code: URL\n- Repo: URL" ]
[ 70, 8, 84, 2, 77 ]
[ "passage: TAGS\n#task_categories-image-classification #size_categories-1K<n<10K #language-English #license-unknown #remote-sensing #earth-observation #geospatial #satellite-imagery #scene-classification #region-us \n# RSSCN7\n\n!RSSCN7## Description\n\nThe RSSCN7 dataset is designed for scene classification tasks and provides a collection of high-resolution RGB images. This dataset comprises a total of 2,800 images, each with a resolution of 400x400 pixels. The images are extracted from Google Earth and cover a range of diverse scenes. RSSCN7 includes seven distinct scene classes, with 400 images per class.## Details### Statistics\n- Total Number of Images: 2,800\n- Image Resolution: 400x400 pixels\n- Scene Classes: 7\n- Dataset Size: 0.36GB\n\nIf you use the RSSCN7 dataset in your research, please consider citing the following publication or the dataset's official website:\n\n\n\n- Paper with code: URL\n- Repo: URL" ]
601cafc569e74ceb41c12aae4ea64806dd933b61
# Dataset Card

This dataset is a subset of Kaggle's The Movie Dataset that contains only the name, release year and overview for some movies from the original dataset.
It is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.

This dataset has a bigger version [here](https://huggingface.co/datasets/mt0rm0/movie_descriptors).

## Dataset Details

### Dataset Description

The dataset has 28655 rows and 3 columns:

- 'name': includes the title of the movies
- 'release_year': indicates the year of release
- 'overview': provides a brief description of each movie, used for advertisement.

The source dataset was filtered to keep only movies with complete metadata in the required fields, a vote average of at least 6, more than 100 votes, and a revenue over 2 Million Dollars.

**Curated by:** [Mario Tormo Romero](https://huggingface.co/mt0rm0)

**Language(s) (NLP):** English

**License:** cc0-1.0

### Dataset Sources
This dataset is a subset of Kaggle's [The Movie Dataset](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset).
We have only used the <kbd>movies_metadata.csv</kbd> file, extracted some features (see Dataset Description) and dropped the rows that weren't complete.

The original dataset has a cc0-1.0 License and we have maintained it.

## Uses

This is a toy dataset created for pedagogical purposes, and is used in the **Working with embeddings** Workshop created and organized by the [AI Service Center Berlin-Brandenburg](https://hpi.de/kisz/) at the [Hasso Plattner Institute](https://hpi.de/).

## Dataset Creation

### Curation Rationale

With this dataset we want to provide a fast way of obtaining the required data for our workshops, without having to download huge datasets that contain far more information than needed.

### Source Data

Our source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.

#### Data Collection and Processing

The data was downloaded from [Kaggle](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset) as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.
The data was processed with the following code:

```python
import pandas as pd

# load the csv file
df = pd.read_csv("movies_metadata.csv", low_memory=False)

# filter movies according to:
# - vote average of at least 6
# - more than 100 votes
# - revenue over 2M$
df = df.loc[(df.vote_average >= 6) & (df.vote_count > 100) & (df.revenue > 2e6)]

# select the required columns, drop rows with missing values and
# reset the index
df = df.loc[:, ['title', 'release_date', 'overview']]
df = df.dropna(axis=0).reset_index(drop=True)

# make a new column with the release year
df.loc[:, 'release_year'] = pd.to_datetime(df.release_date).dt.year

# select the columns in the desired order
df = df.loc[:, ['title', 'release_year', 'overview']]

# save the data to parquet
df.to_parquet('descriptors_data.parquet')
```

#### Who are the source data producers?
The source dataset is an ensemble of data collected by [Rounak Banik](https://www.kaggle.com/rounakbanik) from TMDB and GroupLens.
In particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb.
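Since the dataset is meant for experimenting with embeddings and semantic search, a minimal usage sketch is shown below. The `sentence-transformers` dependency and the specific encoder are illustrative choices and not part of the dataset; the code also assumes the repository exposes a single default `train` split.

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, util

# load the movie descriptors (assumes the default "train" split)
ds = load_dataset("mt0rm0/movie_descriptors_small", split="train")

# embed the movie overviews with an (arbitrarily chosen) sentence encoder
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(ds["overview"], convert_to_tensor=True)

# simple semantic search over the overviews
query = model.encode("a heist that goes terribly wrong", convert_to_tensor=True)
best = util.cos_sim(query, embeddings).argmax().item()
print(ds[best])
```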
mt0rm0/movie_descriptors_small
[ "task_categories:sentence-similarity", "size_categories:10K<n<100K", "language:en", "license:cc0-1.0", "movies", "embeddings", "semantic search", "films", "hpi", "workshop", "region:us" ]
2023-12-04T15:25:21+00:00
{"language": ["en"], "license": "cc0-1.0", "size_categories": ["10K<n<100K"], "task_categories": ["sentence-similarity"], "pretty_name": "\"Movie descriptors for Semantic Search\"", "tags": ["movies", "embeddings", "semantic search", "films", "hpi", "workshop"]}
2023-12-04T16:01:45+00:00
[]
[ "en" ]
TAGS #task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us
# Dataset Card This dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for some movies from the original dataset. It is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute. This dataset has a bigger version here. ## Dataset Details ### Dataset Description The dataset has 28655 rows and 3 columns: - 'name': includes the title of the movies - 'release_year': indicates the year of release - 'overview': provides a brief description of each movie, used for advertisement. The source dataset was filtered for keeping only movies with complete metadata in the required fields, a vote average of at least 6, with more than 100 votes and with a revenue over 2 Million Dollars. Curated by: Mario Tormo Romero Language(s) (NLP): English License: cc0-1.0 ### Dataset Sources This Dataset is a subset of Kaggle's The Movie Dataset. We have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete. The original Dataset has a cc0-1.0 License and we have maintained it. ## Uses This is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute. ## Dataset Creation ### Curation Rationale We want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets with just way too much information. ### Source Data Our source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017. #### Data Collection and Processing The data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted. The data was processed with the following code: #### Who are the source data producers? The source dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens. In particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb.
[ "# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for some movies from the original dataset.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a bigger version here.", "## Dataset Details", "### Dataset Description\n\nThe dataset has 28655 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nThe source dataset was filtered for keeping only movies with complete metadata in the required fields, a vote average of at least 6, with more than 100 votes and with a revenue over 2 Million Dollars.\n \nCurated by: Mario Tormo Romero\n \nLanguage(s) (NLP): English\n\nLicense: cc0-1.0", "### Dataset Sources\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.", "## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.", "## Dataset Creation", "### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets with just way too much information.", "### Source Data\n\nOur source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.", "#### Data Collection and Processing\n\nThe data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.\nThe data was processed with the following code:", "#### Who are the source data producers?\nThe source dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens.\nIn particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb." ]
[ "TAGS\n#task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us \n", "# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for some movies from the original dataset.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a bigger version here.", "## Dataset Details", "### Dataset Description\n\nThe dataset has 28655 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nThe source dataset was filtered for keeping only movies with complete metadata in the required fields, a vote average of at least 6, with more than 100 votes and with a revenue over 2 Million Dollars.\n \nCurated by: Mario Tormo Romero\n \nLanguage(s) (NLP): English\n\nLicense: cc0-1.0", "### Dataset Sources\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.", "## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.", "## Dataset Creation", "### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets with just way too much information.", "### Source Data\n\nOur source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.", "#### Data Collection and Processing\n\nThe data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.\nThe data was processed with the following code:", "#### Who are the source data producers?\nThe source dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens.\nIn particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb." ]
[ 63, 85, 4, 132, 89, 54, 5, 42, 41, 53, 70 ]
[ "passage: TAGS\n#task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us \n# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for some movies from the original dataset.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a bigger version here.## Dataset Details### Dataset Description\n\nThe dataset has 28655 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nThe source dataset was filtered for keeping only movies with complete metadata in the required fields, a vote average of at least 6, with more than 100 votes and with a revenue over 2 Million Dollars.\n \nCurated by: Mario Tormo Romero\n \nLanguage(s) (NLP): English\n\nLicense: cc0-1.0### Dataset Sources\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.## Dataset Creation### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets with just way too much information." ]
b8f239deca71140cee088d806ef50c0ed66b2b4f
# Dataset Card for "ai-generated-text-classification" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ziq/ai-generated-text-classification
[ "region:us" ]
2023-12-04T15:29:35+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "prompt_id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "generated", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4411048, "num_examples": 1378}, {"name": "test", "num_bytes": 133, "num_examples": 3}], "download_size": 2360284, "dataset_size": 4411181}}
2023-12-04T15:58:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ai-generated-text-classification" More Information needed
[ "# Dataset Card for \"ai-generated-text-classification\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ai-generated-text-classification\"\n\nMore Information needed" ]
[ 6, 19 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"ai-generated-text-classification\"\n\nMore Information needed" ]
6511ca0e0959161e85eb382ba3d4cd3d84a4d9f3
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-flan
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-200step-flan](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-flan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-04T15:41:52.746166](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan/blob/main/results_2023-12-04T15-41-52.746166.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4851995546077363, "acc_stderr": 0.034286883868394735, "acc_norm": 0.4898964195341055, "acc_norm_stderr": 0.03504171929947504, "mc1": 0.29865361077111385, "mc1_stderr": 0.016021570613768542, "mc2": 0.45466126872222284, "mc2_stderr": 0.015592630900447884 }, "harness|arc:challenge|25": { "acc": 0.492320819112628, "acc_stderr": 0.014609667440892574, "acc_norm": 0.5247440273037542, "acc_norm_stderr": 0.01459348769493774 }, "harness|hellaswag|10": { "acc": 0.5925114519020116, "acc_stderr": 0.004903628887264536, "acc_norm": 0.7802230631348337, "acc_norm_stderr": 0.0041324914757278775 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4222222222222222, "acc_stderr": 0.04266763404099582, "acc_norm": 0.4222222222222222, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.46710526315789475, "acc_stderr": 0.040601270352363966, "acc_norm": 0.46710526315789475, "acc_norm_stderr": 0.040601270352363966 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5433962264150943, "acc_stderr": 0.03065674869673943, "acc_norm": 0.5433962264150943, "acc_norm_stderr": 0.03065674869673943 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5208333333333334, "acc_stderr": 0.041775789507399935, "acc_norm": 0.5208333333333334, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939098, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4046242774566474, "acc_stderr": 0.03742461193887248, "acc_norm": 0.4046242774566474, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4127659574468085, "acc_stderr": 0.03218471141400351, "acc_norm": 0.4127659574468085, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939392, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939392 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29894179894179895, "acc_stderr": 0.023577604791655805, "acc_norm": 0.29894179894179895, "acc_norm_stderr": 0.023577604791655805 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.24603174603174602, "acc_stderr": 
0.03852273364924314, "acc_norm": 0.24603174603174602, "acc_norm_stderr": 0.03852273364924314 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, "acc_stderr": 0.028396016402761005, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761005 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3694581280788177, "acc_stderr": 0.033959703819985726, "acc_norm": 0.3694581280788177, "acc_norm_stderr": 0.033959703819985726 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5878787878787879, "acc_stderr": 0.03843566993588717, "acc_norm": 0.5878787878787879, "acc_norm_stderr": 0.03843566993588717 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6060606060606061, "acc_stderr": 0.03481285338232963, "acc_norm": 0.6060606060606061, "acc_norm_stderr": 0.03481285338232963 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7253886010362695, "acc_stderr": 0.03221024508041153, "acc_norm": 0.7253886010362695, "acc_norm_stderr": 0.03221024508041153 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4230769230769231, "acc_stderr": 0.02504919787604234, "acc_norm": 0.4230769230769231, "acc_norm_stderr": 0.02504919787604234 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.02684205787383371, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.02684205787383371 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.42016806722689076, "acc_stderr": 0.03206183783236152, "acc_norm": 0.42016806722689076, "acc_norm_stderr": 0.03206183783236152 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.036848815213890225, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.036848815213890225 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6770642201834862, "acc_stderr": 0.02004811592341532, "acc_norm": 0.6770642201834862, "acc_norm_stderr": 0.02004811592341532 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3287037037037037, "acc_stderr": 0.032036140846700596, "acc_norm": 0.3287037037037037, "acc_norm_stderr": 0.032036140846700596 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6617647058823529, "acc_stderr": 0.033205746129454324, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.033205746129454324 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6624472573839663, "acc_stderr": 0.030781549102026226, "acc_norm": 0.6624472573839663, "acc_norm_stderr": 0.030781549102026226 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5695067264573991, "acc_stderr": 0.033231973029429394, "acc_norm": 0.5695067264573991, "acc_norm_stderr": 0.033231973029429394 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5648854961832062, "acc_stderr": 0.04348208051644858, "acc_norm": 0.5648854961832062, "acc_norm_stderr": 0.04348208051644858 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6363636363636364, "acc_stderr": 0.043913262867240704, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.043913262867240704 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6018518518518519, "acc_stderr": 0.04732332615978813, "acc_norm": 
0.6018518518518519, "acc_norm_stderr": 0.04732332615978813 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5644171779141104, "acc_stderr": 0.03895632464138937, "acc_norm": 0.5644171779141104, "acc_norm_stderr": 0.03895632464138937 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280041, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280041 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7222222222222222, "acc_stderr": 0.02934311479809446, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.02934311479809446 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6768837803320562, "acc_stderr": 0.016723726512343048, "acc_norm": 0.6768837803320562, "acc_norm_stderr": 0.016723726512343048 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5173410404624278, "acc_stderr": 0.02690290045866664, "acc_norm": 0.5173410404624278, "acc_norm_stderr": 0.02690290045866664 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.22681564245810057, "acc_stderr": 0.014005843570897899, "acc_norm": 0.22681564245810057, "acc_norm_stderr": 0.014005843570897899 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5130718954248366, "acc_stderr": 0.028620130800700246, "acc_norm": 0.5130718954248366, "acc_norm_stderr": 0.028620130800700246 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5659163987138264, "acc_stderr": 0.02815023224453559, "acc_norm": 0.5659163987138264, "acc_norm_stderr": 0.02815023224453559 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5679012345679012, "acc_stderr": 0.02756301097160668, "acc_norm": 0.5679012345679012, "acc_norm_stderr": 0.02756301097160668 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3494132985658409, "acc_stderr": 0.012177306252786688, "acc_norm": 0.3494132985658409, "acc_norm_stderr": 0.012177306252786688 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.45588235294117646, "acc_stderr": 0.03025437257397668, "acc_norm": 0.45588235294117646, "acc_norm_stderr": 0.03025437257397668 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4918300653594771, "acc_stderr": 0.020225134343057265, "acc_norm": 0.4918300653594771, "acc_norm_stderr": 0.020225134343057265 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5102040816326531, "acc_stderr": 0.03200255347893782, "acc_norm": 0.5102040816326531, "acc_norm_stderr": 0.03200255347893782 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6467661691542289, "acc_stderr": 0.03379790611796777, "acc_norm": 0.6467661691542289, "acc_norm_stderr": 0.03379790611796777 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 
0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7192982456140351, "acc_stderr": 0.03446296217088427, "acc_norm": 0.7192982456140351, "acc_norm_stderr": 0.03446296217088427 }, "harness|truthfulqa:mc|0": { "mc1": 0.29865361077111385, "mc1_stderr": 0.016021570613768542, "mc2": 0.45466126872222284, "mc2_stderr": 0.015592630900447884 }, "harness|winogrande|5": { "acc": 0.7269139700078927, "acc_stderr": 0.012522020105869456 }, "harness|gsm8k|5": { "acc": 0.1865049279757392, "acc_stderr": 0.010729140039689897 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
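As a complement to the `load_dataset` example above, the per-task numbers in the "Latest results" block can also be read straight from the raw results file linked there. This is only a sketch: it assumes the JSON is either the flat task-to-metrics mapping shown above or wraps that mapping under a top-level "results" key.

```python
import json
from huggingface_hub import hf_hub_download

# download the results file referenced in the "Latest results" section
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan",
    filename="results_2023-12-04T15-41-52.746166.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
results = data.get("results", data)  # handle either JSON layout

# e.g. list the five MMLU subtasks with the highest accuracy
mmlu = {k: v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```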
open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan
[ "region:us" ]
2023-12-04T15:44:49+00:00
{"pretty_name": "Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan", "dataset_summary": "Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-200step-flan](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-flan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:41:52.746166](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-flan/blob/main/results_2023-12-04T15-41-52.746166.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4851995546077363,\n \"acc_stderr\": 0.034286883868394735,\n \"acc_norm\": 0.4898964195341055,\n \"acc_norm_stderr\": 0.03504171929947504,\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.45466126872222284,\n \"mc2_stderr\": 0.015592630900447884\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.492320819112628,\n \"acc_stderr\": 0.014609667440892574,\n \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.01459348769493774\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5925114519020116,\n \"acc_stderr\": 0.004903628887264536,\n \"acc_norm\": 0.7802230631348337,\n \"acc_norm_stderr\": 0.0041324914757278775\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655805,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655805\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03481285338232963,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03481285338232963\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n \"acc_norm\": 0.7253886010362695,\n 
\"acc_norm_stderr\": 0.03221024508041153\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.02504919787604234,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341532,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341532\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.033205746129454324,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.033205746129454324\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026226,\n \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026226\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n \"acc_stderr\": 0.014005843570897899,\n \"acc_norm\": 0.22681564245810057,\n \"acc_norm_stderr\": 0.014005843570897899\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n \"acc_stderr\": 0.02815023224453559,\n \"acc_norm\": 0.5659163987138264,\n \"acc_norm_stderr\": 0.02815023224453559\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.02756301097160668,\n \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.02756301097160668\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n \"acc_stderr\": 0.012177306252786688,\n \"acc_norm\": 0.3494132985658409,\n \"acc_norm_stderr\": 0.012177306252786688\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03025437257397668,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03025437257397668\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4918300653594771,\n \"acc_stderr\": 0.020225134343057265,\n \"acc_norm\": 0.4918300653594771,\n \"acc_norm_stderr\": 0.020225134343057265\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.45466126872222284,\n \"mc2_stderr\": 0.015592630900447884\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.1865049279757392,\n \"acc_stderr\": 0.010729140039689897\n }\n}\n```", "repo_url": "https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-flan", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-41-52.746166.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-41-52.746166.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-41-52.746166.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-41-52.746166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-41-52.746166.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_41_52.746166", "path": ["**/details_harness|winogrande|5_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-41-52.746166.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T15_41_52.746166", "path": ["results_2023-12-04T15-41-52.746166.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T15-41-52.746166.parquet"]}]}]}
2023-12-04T15:45:32+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-200step-flan on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T15:41:52.746166(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-200step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:41:52.746166(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-200step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:41:52.746166(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 30, 31, 179, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-flan## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Korabbit/Llama-2-7b-chat-hf-afr-200step-flan on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:41:52.746166(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e95d34a64be6885cb5a751cf28e3445859d9d492
# Dataset Card for "t5_small_test_set_context_len_64" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
yardeny/t5_small_test_set_context_len_64
[ "region:us" ]
2023-12-04T15:45:18+00:00
{"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 419840, "num_examples": 1280}], "download_size": 171561, "dataset_size": 419840}}
2023-12-04T15:45:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "t5_small_test_set_context_len_64" More Information needed
[ "# Dataset Card for \"t5_small_test_set_context_len_64\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"t5_small_test_set_context_len_64\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"t5_small_test_set_context_len_64\"\n\nMore Information needed" ]
ee2136511a13ec169d2a32b78c3c495c399f7f0f
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Intel/neural-chat-7b-v3-2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-2](https://huggingface.co/Intel/neural-chat-7b-v3-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T15:53:32.280845](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2/blob/main/results_2023-12-04T15-53-32.280845.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6369210872251465, "acc_stderr": 0.0326377312636888, "acc_norm": 0.6396211867921318, "acc_norm_stderr": 0.03329595828089076, "mc1": 0.43818849449204406, "mc1_stderr": 0.017369236164404445, "mc2": 0.596824313919398, "mc2_stderr": 0.015111088211554574 }, "harness|arc:challenge|25": { "acc": 0.636518771331058, "acc_stderr": 0.014056207319068283, "acc_norm": 0.6749146757679181, "acc_norm_stderr": 0.013688147309729124 }, "harness|hellaswag|10": { "acc": 0.6414060944035053, "acc_stderr": 0.004786075107572188, "acc_norm": 0.8391754630551683, "acc_norm_stderr": 0.0036661823284423437 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595852, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595852 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119668, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119668 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137602, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137602 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.02366421667164251, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.02366421667164251 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03053289223393202, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03053289223393202 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.028897748741131143, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.028897748741131143 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 
0.029953823891887037, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887037 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.015848255806501534, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.015848255806501534 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639318, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601457, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229146, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229146 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467766, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467766 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165623, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165623 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8109833971902938, "acc_stderr": 0.014000791294406999, "acc_norm": 0.8109833971902938, "acc_norm_stderr": 0.014000791294406999 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436596, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436596 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37988826815642457, "acc_stderr": 0.01623282681867849, "acc_norm": 0.37988826815642457, "acc_norm_stderr": 0.01623282681867849 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.02617390850671858, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.02617390850671858 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.43415906127770537, "acc_stderr": 0.012659033237067248, "acc_norm": 0.43415906127770537, "acc_norm_stderr": 0.012659033237067248 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146292, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146292 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223974, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223974 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784603, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784603 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.038612291966536934, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.43818849449204406, "mc1_stderr": 0.017369236164404445, "mc2": 0.596824313919398, "mc2_stderr": 0.015111088211554574 }, "harness|winogrande|5": { "acc": 0.7995264404104183, "acc_stderr": 0.011251958281205083 }, "harness|gsm8k|5": { "acc": 0.5511751326762699, "acc_stderr": 0.013700157442788066 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
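The card above mentions an additional "results" configuration that stores the aggregated metrics of the run, but only shows how to load a per-task config. A minimal sketch of reading the aggregate, assuming the "results" config and "latest" split follow the same layout as the config metadata shown for the other evaluation record in this dump:

```python
from datasets import load_dataset

# Aggregated metrics for this model's evaluation run (repo id as given in this record).
results = load_dataset(
    "open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema instead of assuming column names
print(results[0])            # the aggregated row for the most recent run
```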
open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2
[ "region:us" ]
2023-12-04T15:56:23+00:00
{"pretty_name": "Evaluation run of Intel/neural-chat-7b-v3-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-2](https://huggingface.co/Intel/neural-chat-7b-v3-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:53:32.280845](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2/blob/main/results_2023-12-04T15-53-32.280845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369210872251465,\n \"acc_stderr\": 0.0326377312636888,\n \"acc_norm\": 0.6396211867921318,\n \"acc_norm_stderr\": 0.03329595828089076,\n \"mc1\": 0.43818849449204406,\n \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.596824313919398,\n \"mc2_stderr\": 0.015111088211554574\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.636518771331058,\n \"acc_stderr\": 0.014056207319068283,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n \"acc_stderr\": 0.004786075107572188,\n \"acc_norm\": 0.8391754630551683,\n \"acc_norm_stderr\": 0.0036661823284423437\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n 
\"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601457,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601457\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 
0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436596,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436596\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.01623282681867849,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.01623282681867849\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223974,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223974\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43818849449204406,\n \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.596824313919398,\n \"mc2_stderr\": 0.015111088211554574\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205083\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5511751326762699,\n \"acc_stderr\": 0.013700157442788066\n }\n}\n```", "repo_url": "https://huggingface.co/Intel/neural-chat-7b-v3-2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-53-32.280845.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-53-32.280845.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-53-32.280845.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-53-32.280845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-53-32.280845.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-53-32.280845.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["**/details_harness|winogrande|5_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-53-32.280845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T15_53_32.280845", "path": ["results_2023-12-04T15-53-32.280845.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T15-53-32.280845.parquet"]}]}]}
2023-12-04T15:57:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T15:53:32.280845 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
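For reference, the loading example that the summary above refers to, taken verbatim from this record's repository metadata, is:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (5-shot Winogrande here); any other
# config name listed in the repository metadata can be substituted.
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-2",
                    "harness_winogrande_5",
                    split="train")
```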
[ "# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:53:32.280845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:53:32.280845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Intel/neural-chat-7b-v3-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:53:32.280845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
6c19eaf3e7d7465b1ba5bf7be1e58c4e47849a3b
# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Weyaxi/zephyr-alpha-Nebula-v2-7B](https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T15:57:31.199945](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B/blob/main/results_2023-12-04T15-57-31.199945.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5652954998411592, "acc_stderr": 0.03358458028499026, "acc_norm": 0.5715669236783297, "acc_norm_stderr": 0.03429444097066663, "mc1": 0.4283965728274174, "mc1_stderr": 0.01732308859731475, "mc2": 0.5827588614927685, "mc2_stderr": 0.015689365398538633 }, "harness|arc:challenge|25": { "acc": 0.5631399317406144, "acc_stderr": 0.014494421584256524, "acc_norm": 0.5861774744027304, "acc_norm_stderr": 0.014392730009221009 }, "harness|hellaswag|10": { "acc": 0.6397132045409281, "acc_stderr": 0.004791024004588008, "acc_norm": 0.8305118502290381, "acc_norm_stderr": 0.003744157442536553 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5811320754716981, "acc_stderr": 0.030365050829115215, "acc_norm": 0.5811320754716981, "acc_norm_stderr": 0.030365050829115215 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.625, "acc_stderr": 0.04048439222695598, "acc_norm": 0.625, "acc_norm_stderr": 0.04048439222695598 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 
0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.0372424959581773, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.0372424959581773 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082634, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082634 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4978723404255319, "acc_stderr": 0.03268572658667492, "acc_norm": 0.4978723404255319, "acc_norm_stderr": 0.03268572658667492 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.04166567577101579, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.0250107491161376, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.0250107491161376 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.04451807959055328, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.04451807959055328 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6903225806451613, "acc_stderr": 0.026302774983517418, "acc_norm": 0.6903225806451613, "acc_norm_stderr": 0.026302774983517418 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4433497536945813, "acc_stderr": 0.03495334582162933, "acc_norm": 0.4433497536945813, "acc_norm_stderr": 0.03495334582162933 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7090909090909091, "acc_stderr": 0.03546563019624335, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.03546563019624335 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124498, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124498 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7512953367875648, "acc_stderr": 0.031195840877700293, "acc_norm": 0.7512953367875648, "acc_norm_stderr": 0.031195840877700293 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5717948717948718, "acc_stderr": 0.025088301454694827, "acc_norm": 0.5717948717948718, "acc_norm_stderr": 0.025088301454694827 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 
0.03218358107742613, "acc_norm": 0.5672268907563025, "acc_norm_stderr": 0.03218358107742613 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242741, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242741 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.016659279700295813, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.016659279700295813 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.03350991604696042, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.03350991604696042 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7107843137254902, "acc_stderr": 0.031822318676475544, "acc_norm": 0.7107843137254902, "acc_norm_stderr": 0.031822318676475544 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7426160337552743, "acc_stderr": 0.028458820991460302, "acc_norm": 0.7426160337552743, "acc_norm_stderr": 0.028458820991460302 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330314, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330314 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6183206106870229, "acc_stderr": 0.042607351576445594, "acc_norm": 0.6183206106870229, "acc_norm_stderr": 0.042607351576445594 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724146, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724146 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.29464285714285715, "acc_stderr": 0.0432704093257873, "acc_norm": 0.29464285714285715, "acc_norm_stderr": 0.0432704093257873 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8076923076923077, "acc_stderr": 0.025819233256483717, "acc_norm": 0.8076923076923077, "acc_norm_stderr": 0.025819233256483717 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7701149425287356, "acc_stderr": 0.015046301846691812, "acc_norm": 0.7701149425287356, "acc_norm_stderr": 0.015046301846691812 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.615606936416185, "acc_stderr": 0.026189666966272035, "acc_norm": 0.615606936416185, "acc_norm_stderr": 0.026189666966272035 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27150837988826815, "acc_stderr": 0.014874252168095268, "acc_norm": 0.27150837988826815, "acc_norm_stderr": 0.014874252168095268 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5849673202614379, "acc_stderr": 0.028213504177824103, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.028213504177824103 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.6574074074074074, "acc_stderr": 0.02640614597362567, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.02640614597362567 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4326241134751773, "acc_stderr": 0.029555454236778852, "acc_norm": 0.4326241134751773, "acc_norm_stderr": 0.029555454236778852 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.439374185136897, "acc_stderr": 0.012676014778580212, "acc_norm": 0.439374185136897, "acc_norm_stderr": 0.012676014778580212 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5735294117647058, "acc_stderr": 0.030042615832714874, "acc_norm": 0.5735294117647058, "acc_norm_stderr": 0.030042615832714874 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5441176470588235, "acc_stderr": 0.020148939420415745, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.020148939420415745 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670239 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5183673469387755, "acc_stderr": 0.03198761546763127, "acc_norm": 0.5183673469387755, "acc_norm_stderr": 0.03198761546763127 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7611940298507462, "acc_stderr": 0.030147775935409217, "acc_norm": 0.7611940298507462, "acc_norm_stderr": 0.030147775935409217 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.4578313253012048, "acc_stderr": 0.0387862677100236, "acc_norm": 0.4578313253012048, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.031267817146631786, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.031267817146631786 }, "harness|truthfulqa:mc|0": { "mc1": 0.4283965728274174, "mc1_stderr": 0.01732308859731475, "mc2": 0.5827588614927685, "mc2_stderr": 0.015689365398538633 }, "harness|winogrande|5": { "acc": 0.7355958958168903, "acc_stderr": 0.012394724896983796 }, "harness|gsm8k|5": { "acc": 0.23881728582259287, "acc_stderr": 0.011744097081003803 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B
[ "region:us" ]
2023-12-04T16:00:23+00:00
{"pretty_name": "Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/zephyr-alpha-Nebula-v2-7B](https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:57:31.199945](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B/blob/main/results_2023-12-04T15-57-31.199945.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5652954998411592,\n \"acc_stderr\": 0.03358458028499026,\n \"acc_norm\": 0.5715669236783297,\n \"acc_norm_stderr\": 0.03429444097066663,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.01732308859731475,\n \"mc2\": 0.5827588614927685,\n \"mc2_stderr\": 0.015689365398538633\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256524,\n \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6397132045409281,\n \"acc_stderr\": 0.004791024004588008,\n \"acc_norm\": 0.8305118502290381,\n \"acc_norm_stderr\": 0.003744157442536553\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115215,\n \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115215\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n 
\"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082634,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082634\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.0250107491161376,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.0250107491161376\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700293,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700293\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5717948717948718,\n 
\"acc_stderr\": 0.025088301454694827,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694827\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295813,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295813\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n \"acc_stderr\": 0.015046301846691812,\n \"acc_norm\": 
0.7701149425287356,\n \"acc_norm_stderr\": 0.015046301846691812\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n \"acc_stderr\": 0.014874252168095268,\n \"acc_norm\": 0.27150837988826815,\n \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824103,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824103\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.02640614597362567,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.02640614597362567\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n \"acc_stderr\": 0.012676014778580212,\n \"acc_norm\": 0.439374185136897,\n \"acc_norm_stderr\": 0.012676014778580212\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714874,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714874\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.01732308859731475,\n \"mc2\": 0.5827588614927685,\n \"mc2_stderr\": 0.015689365398538633\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983796\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23881728582259287,\n \"acc_stderr\": 0.011744097081003803\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["**/details_harness|winogrande|5_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-57-31.199945.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T15_57_31.199945", "path": ["results_2023-12-04T15-57-31.199945.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T15-57-31.199945.parquet"]}]}]}
2023-12-04T16:01:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Weyaxi/zephyr-alpha-Nebula-v2-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T15:57:31.199945 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
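A minimal loading sketch for the call referenced above, using the `harness_winogrande_5` configuration named in this dataset's metadata (any other configuration listed there can be substituted in the same way):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (Winogrande, 5-shot).
# The "train" split always points to the results of the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B",
    "harness_winogrande_5",
    split="train",
)
```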
[ "# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/zephyr-alpha-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:57:31.199945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/zephyr-alpha-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:57:31.199945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Weyaxi/zephyr-alpha-Nebula-v2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:57:31.199945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
986f7b02017660be479d46bafeb7145a1e89c70f
# Dataset Card for Evaluation run of uukuguy/speechless-tools-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/speechless-tools-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/speechless-tools-7b](https://huggingface.co/uukuguy/speechless-tools-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-tools-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T15:59:01.119688](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tools-7b/blob/main/results_2023-12-04T15-59-01.119688.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.33439260458192543, "acc_stderr": 0.03318510046683825, "acc_norm": 0.3375068000778248, "acc_norm_stderr": 0.03395842042621018, "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219371, "mc2": 0.4408018939045002, "mc2_stderr": 0.015451244968527669 }, "harness|arc:challenge|25": { "acc": 0.3412969283276451, "acc_stderr": 0.013855831287497717, "acc_norm": 0.3890784982935154, "acc_norm_stderr": 0.014247309976045607 }, "harness|hellaswag|10": { "acc": 0.44901414060944034, "acc_stderr": 0.004963771168672087, "acc_norm": 0.5768771161123282, "acc_norm_stderr": 0.004930448527146669 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.31851851851851853, "acc_stderr": 0.040247784019771096, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.28289473684210525, "acc_stderr": 0.03665349695640767, "acc_norm": 0.28289473684210525, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.36981132075471695, "acc_stderr": 0.029711421880107922, "acc_norm": 0.36981132075471695, "acc_norm_stderr": 0.029711421880107922 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2708333333333333, "acc_stderr": 0.037161774375660164, "acc_norm": 0.2708333333333333, "acc_norm_stderr": 0.037161774375660164 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.17, "acc_stderr": 0.0377525168068637, "acc_norm": 0.17, 
"acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2947976878612717, "acc_stderr": 0.034765996075164785, "acc_norm": 0.2947976878612717, "acc_norm_stderr": 0.034765996075164785 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2936170212765957, "acc_stderr": 0.029771642712491234, "acc_norm": 0.2936170212765957, "acc_norm_stderr": 0.029771642712491234 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.24561403508771928, "acc_stderr": 0.04049339297748142, "acc_norm": 0.24561403508771928, "acc_norm_stderr": 0.04049339297748142 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.3586206896551724, "acc_stderr": 0.03996629574876719, "acc_norm": 0.3586206896551724, "acc_norm_stderr": 0.03996629574876719 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.26455026455026454, "acc_stderr": 0.022717467897708617, "acc_norm": 0.26455026455026454, "acc_norm_stderr": 0.022717467897708617 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.23015873015873015, "acc_stderr": 0.03764950879790606, "acc_norm": 0.23015873015873015, "acc_norm_stderr": 0.03764950879790606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3225806451612903, "acc_stderr": 0.026593084516572284, "acc_norm": 0.3225806451612903, "acc_norm_stderr": 0.026593084516572284 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.031618563353586086, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.031618563353586086 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.4121212121212121, "acc_stderr": 0.03843566993588718, "acc_norm": 0.4121212121212121, "acc_norm_stderr": 0.03843566993588718 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3939393939393939, "acc_stderr": 0.03481285338232963, "acc_norm": 0.3939393939393939, "acc_norm_stderr": 0.03481285338232963 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.34196891191709844, "acc_stderr": 0.034234651001042844, "acc_norm": 0.34196891191709844, "acc_norm_stderr": 0.034234651001042844 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.31794871794871793, "acc_stderr": 0.023610884308927858, "acc_norm": 0.31794871794871793, "acc_norm_stderr": 0.023610884308927858 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23333333333333334, "acc_stderr": 0.025787874220959323, "acc_norm": 0.23333333333333334, "acc_norm_stderr": 0.025787874220959323 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.29831932773109243, "acc_stderr": 0.029719142876342853, "acc_norm": 0.29831932773109243, "acc_norm_stderr": 0.029719142876342853 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3798165137614679, "acc_stderr": 0.020808825617866244, "acc_norm": 0.3798165137614679, "acc_norm_stderr": 0.020808825617866244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.21296296296296297, "acc_stderr": 0.027920963147993662, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.027920963147993662 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.36764705882352944, "acc_stderr": 0.03384132045674119, "acc_norm": 0.36764705882352944, "acc_norm_stderr": 0.03384132045674119 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.4430379746835443, "acc_stderr": 0.03233532777533485, "acc_norm": 0.4430379746835443, "acc_norm_stderr": 0.03233532777533485 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3452914798206278, "acc_stderr": 0.03191100192835794, "acc_norm": 0.3452914798206278, "acc_norm_stderr": 0.03191100192835794 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.40458015267175573, "acc_stderr": 0.043046937953806645, "acc_norm": 0.40458015267175573, "acc_norm_stderr": 0.043046937953806645 }, "harness|hendrycksTest-international_law|5": { "acc": 0.4628099173553719, "acc_stderr": 0.045517111961042175, "acc_norm": 0.4628099173553719, "acc_norm_stderr": 0.045517111961042175 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.48148148148148145, "acc_stderr": 0.04830366024635331, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.04830366024635331 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.39263803680981596, "acc_stderr": 0.03836740907831028, "acc_norm": 0.39263803680981596, "acc_norm_stderr": 0.03836740907831028 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.25892857142857145, "acc_stderr": 0.041577515398656284, "acc_norm": 0.25892857142857145, "acc_norm_stderr": 0.041577515398656284 }, "harness|hendrycksTest-management|5": { "acc": 0.4077669902912621, "acc_stderr": 0.048657775704107696, "acc_norm": 0.4077669902912621, "acc_norm_stderr": 0.048657775704107696 }, "harness|hendrycksTest-marketing|5": { "acc": 0.5, "acc_stderr": 0.03275608910402091, "acc_norm": 0.5, "acc_norm_stderr": 0.03275608910402091 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.4086845466155811, "acc_stderr": 0.017579250148153393, "acc_norm": 0.4086845466155811, "acc_norm_stderr": 0.017579250148153393 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.38439306358381503, "acc_stderr": 0.026189666966272028, "acc_norm": 0.38439306358381503, "acc_norm_stderr": 0.026189666966272028 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2670391061452514, "acc_stderr": 0.014796502622562567, "acc_norm": 0.2670391061452514, "acc_norm_stderr": 0.014796502622562567 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.32679738562091504, "acc_stderr": 0.026857294663281416, "acc_norm": 0.32679738562091504, "acc_norm_stderr": 0.026857294663281416 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.3665594855305466, "acc_stderr": 0.027368078243971614, "acc_norm": 0.3665594855305466, "acc_norm_stderr": 
0.027368078243971614 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02686949074481525, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02686949074481525 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.30141843971631205, "acc_stderr": 0.027374128882631146, "acc_norm": 0.30141843971631205, "acc_norm_stderr": 0.027374128882631146 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2920469361147327, "acc_stderr": 0.011613349136271808, "acc_norm": 0.2920469361147327, "acc_norm_stderr": 0.011613349136271808 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.2426470588235294, "acc_stderr": 0.026040662474201268, "acc_norm": 0.2426470588235294, "acc_norm_stderr": 0.026040662474201268 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.31862745098039214, "acc_stderr": 0.018850084696468712, "acc_norm": 0.31862745098039214, "acc_norm_stderr": 0.018850084696468712 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4, "acc_stderr": 0.0469237132203465, "acc_norm": 0.4, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.3673469387755102, "acc_stderr": 0.030862144921087558, "acc_norm": 0.3673469387755102, "acc_norm_stderr": 0.030862144921087558 }, "harness|hendrycksTest-sociology|5": { "acc": 0.4228855721393035, "acc_stderr": 0.034932317774212816, "acc_norm": 0.4228855721393035, "acc_norm_stderr": 0.034932317774212816 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-virology|5": { "acc": 0.3132530120481928, "acc_stderr": 0.03610805018031024, "acc_norm": 0.3132530120481928, "acc_norm_stderr": 0.03610805018031024 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03615507630310935, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03615507630310935 }, "harness|truthfulqa:mc|0": { "mc1": 0.2741738066095471, "mc1_stderr": 0.015616518497219371, "mc2": 0.4408018939045002, "mc2_stderr": 0.015451244968527669 }, "harness|winogrande|5": { "acc": 0.585635359116022, "acc_stderr": 0.013844846232268558 }, "harness|gsm8k|5": { "acc": 0.07505686125852919, "acc_stderr": 0.007257633145486642 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
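For the aggregated numbers quoted in the "Latest results" section, the same loading pattern can target the "results" configuration and its "latest" split instead of a per-task config. The sketch below assumes only the `datasets` library and the config/split names listed in this card; the exact column layout of the aggregated table is not documented here, so the example inspects what comes back rather than relying on specific field names.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: "results" config, "latest" split
# (per the configuration list in this card).
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-tools-7b",
    "results",
    split="latest",
)

# The schema of this table is not spelled out above, so inspect it first.
print(results.column_names)
print(results[0])
```

The per-task detail configs (for example "harness_winogrande_5" shown earlier) can be loaded the same way by swapping in the corresponding configuration name.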
open-llm-leaderboard/details_uukuguy__speechless-tools-7b
[ "region:us" ]
2023-12-04T16:02:01+00:00
{"pretty_name": "Evaluation run of uukuguy/speechless-tools-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-tools-7b](https://huggingface.co/uukuguy/speechless-tools-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-tools-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T15:59:01.119688](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tools-7b/blob/main/results_2023-12-04T15-59-01.119688.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33439260458192543,\n \"acc_stderr\": 0.03318510046683825,\n \"acc_norm\": 0.3375068000778248,\n \"acc_norm_stderr\": 0.03395842042621018,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.4408018939045002,\n \"mc2_stderr\": 0.015451244968527669\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3412969283276451,\n \"acc_stderr\": 0.013855831287497717,\n \"acc_norm\": 0.3890784982935154,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44901414060944034,\n \"acc_stderr\": 0.004963771168672087,\n \"acc_norm\": 0.5768771161123282,\n \"acc_norm_stderr\": 0.004930448527146669\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.029711421880107922,\n \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.029711421880107922\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n 
\"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.034765996075164785,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.034765996075164785\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491234,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491234\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3225806451612903,\n \"acc_stderr\": 0.026593084516572284,\n \"acc_norm\": 0.3225806451612903,\n \"acc_norm_stderr\": 0.026593084516572284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.031618563353586086,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.031618563353586086\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588718,\n \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588718\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.03481285338232963,\n \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03481285338232963\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.34196891191709844,\n \"acc_stderr\": 0.034234651001042844,\n \"acc_norm\": 0.34196891191709844,\n \"acc_norm_stderr\": 0.034234651001042844\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.023610884308927858,\n \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.023610884308927858\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959323,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959323\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3798165137614679,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.3798165137614679,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993662,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993662\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.03384132045674119,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.03384132045674119\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533485,\n \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533485\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4628099173553719,\n \"acc_stderr\": 0.045517111961042175,\n \"acc_norm\": 0.4628099173553719,\n \"acc_norm_stderr\": 0.045517111961042175\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831028,\n \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831028\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.048657775704107696,\n \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.048657775704107696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03275608910402091,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03275608910402091\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4086845466155811,\n \"acc_stderr\": 0.017579250148153393,\n \"acc_norm\": 
0.4086845466155811,\n \"acc_norm_stderr\": 0.017579250148153393\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272028,\n \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272028\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2670391061452514,\n \"acc_stderr\": 0.014796502622562567,\n \"acc_norm\": 0.2670391061452514,\n \"acc_norm_stderr\": 0.014796502622562567\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.32679738562091504,\n \"acc_stderr\": 0.026857294663281416,\n \"acc_norm\": 0.32679738562091504,\n \"acc_norm_stderr\": 0.026857294663281416\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n \"acc_stderr\": 0.027368078243971614,\n \"acc_norm\": 0.3665594855305466,\n \"acc_norm_stderr\": 0.027368078243971614\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02686949074481525,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02686949074481525\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631146,\n \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n \"acc_stderr\": 0.011613349136271808,\n \"acc_norm\": 0.2920469361147327,\n \"acc_norm_stderr\": 0.011613349136271808\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201268,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201268\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.31862745098039214,\n \"acc_stderr\": 0.018850084696468712,\n \"acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.018850084696468712\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.030862144921087558,\n \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.030862144921087558\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.4408018939045002,\n \"mc2_stderr\": 0.015451244968527669\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.585635359116022,\n \"acc_stderr\": 0.013844846232268558\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \"acc_stderr\": 0.007257633145486642\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-tools-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-59-01.119688.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-59-01.119688.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-59-01.119688.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T15-59-01.119688.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-59-01.119688.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-59-01.119688.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["**/details_harness|winogrande|5_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T15-59-01.119688.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T15_59_01.119688", "path": ["results_2023-12-04T15-59-01.119688.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T15-59-01.119688.parquet"]}]}]}
2023-12-04T16:02:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uukuguy/speechless-tools-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uukuguy/speechless-tools-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T15:59:01.119688 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
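The flattened card above says "you can for instance do the following:" but the code snippet itself was dropped in this plain-text copy. A minimal sketch of that loading call, assuming the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other evaluation-run cards, with the `harness_winogrande_5` config name taken from this record's config metadata:

```python
from datasets import load_dataset

# Repo id is an assumption inferred from the naming pattern of the other
# evaluation-run cards; the config name comes from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-tools-7b",
    "harness_winogrande_5",
    split="train",
)
```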
[ "# Dataset Card for Evaluation run of uukuguy/speechless-tools-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-tools-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:59:01.119688(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uukuguy/speechless-tools-7b", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-tools-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T15:59:01.119688(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-tools-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-tools-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T15:59:01.119688(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
765870e3b8830a91917aba0e445a62a25e563469
# Dataset Card for Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/zephyr-7b-alpha-dare-0.85 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/zephyr-7b-alpha-dare-0.85](https://huggingface.co/uukuguy/zephyr-7b-alpha-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:03:30.985884](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85/blob/main/results_2023-12-04T16-03-30.985884.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6405125012890543, "acc_stderr": 0.0322440782989453, "acc_norm": 0.6457442431541438, "acc_norm_stderr": 0.032888705588954556, "mc1": 0.29498164014687883, "mc1_stderr": 0.015964400965589657, "mc2": 0.4441404853042373, "mc2_stderr": 0.014450558004670922 }, "harness|arc:challenge|25": { "acc": 0.5767918088737202, "acc_stderr": 0.01443803622084803, "acc_norm": 0.6117747440273038, "acc_norm_stderr": 0.01424161420741405 }, "harness|hellaswag|10": { "acc": 0.6387173869747063, "acc_stderr": 0.004793904922401889, "acc_norm": 0.8366859191396137, "acc_norm_stderr": 0.0036889652317335197 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.04793724854411021, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411021 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.03842498559395268, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.03842498559395268 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, 
"acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3941798941798942, "acc_stderr": 0.02516798233389414, "acc_norm": 0.3941798941798942, "acc_norm_stderr": 0.02516798233389414 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919443, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.6638655462184874, "acc_stderr": 0.030684737115135356, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.030684737115135356 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.818348623853211, "acc_stderr": 0.016530617409266875, "acc_norm": 0.818348623853211, "acc_norm_stderr": 0.016530617409266875 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5787037037037037, "acc_stderr": 0.033674621388960775, "acc_norm": 0.5787037037037037, "acc_norm_stderr": 0.033674621388960775 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639318, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639318 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7763713080168776, "acc_stderr": 0.027123298205229966, "acc_norm": 0.7763713080168776, "acc_norm_stderr": 0.027123298205229966 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057222, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057222 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.03760178006026621, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.03760178006026621 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406943, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406943 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973136, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973136 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577615, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577615 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3139664804469274, "acc_stderr": 0.015521923933523642, "acc_norm": 0.3139664804469274, "acc_norm_stderr": 0.015521923933523642 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.02600330111788514, "acc_norm": 0.7009646302250804, 
"acc_norm_stderr": 0.02600330111788514 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.02474862449053737, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.02474862449053737 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.012685906538206242, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.012685906538206242 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.019047485239360378, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.019047485239360378 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454125, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454125 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.29498164014687883, "mc1_stderr": 0.015964400965589657, "mc2": 0.4441404853042373, "mc2_stderr": 0.014450558004670922 }, "harness|winogrande|5": { "acc": 0.7845303867403315, "acc_stderr": 0.011555295286059282 }, "harness|gsm8k|5": { "acc": 0.42077331311599697, "acc_stderr": 0.013598489497182837 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85
[ "region:us" ]
2023-12-04T16:06:21+00:00
{"pretty_name": "Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/zephyr-7b-alpha-dare-0.85](https://huggingface.co/uukuguy/zephyr-7b-alpha-dare-0.85) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:03:30.985884](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85/blob/main/results_2023-12-04T16-03-30.985884.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6405125012890543,\n \"acc_stderr\": 0.0322440782989453,\n \"acc_norm\": 0.6457442431541438,\n \"acc_norm_stderr\": 0.032888705588954556,\n \"mc1\": 0.29498164014687883,\n \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4441404853042373,\n \"mc2_stderr\": 0.014450558004670922\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.01443803622084803,\n \"acc_norm\": 0.6117747440273038,\n \"acc_norm_stderr\": 0.01424161420741405\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6387173869747063,\n \"acc_stderr\": 0.004793904922401889,\n \"acc_norm\": 0.8366859191396137,\n \"acc_norm_stderr\": 0.0036889652317335197\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 
0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n \"acc_stderr\": 0.015521923933523642,\n \"acc_norm\": 0.3139664804469274,\n \"acc_norm_stderr\": 0.015521923933523642\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4441404853042373,\n \"mc2_stderr\": 0.014450558004670922\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42077331311599697,\n \"acc_stderr\": 0.013598489497182837\n }\n}\n```", "repo_url": 
"https://huggingface.co/uukuguy/zephyr-7b-alpha-dare-0.85", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-03-30.985884.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-03-30.985884.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-03-30.985884.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-03-30.985884.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-03-30.985884.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-03-30.985884.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["**/details_harness|winogrande|5_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-03-30.985884.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_03_30.985884", "path": ["results_2023-12-04T16-03-30.985884.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-03-30.985884.parquet"]}]}]}
2023-12-04T16:07:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uukuguy/zephyr-7b-alpha-dare-0.85 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:03:30.985884 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
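The load snippet referenced above ("To load the details from a run...") is stripped in this flattened copy; a minimal sketch follows the pattern used by these leaderboard cards (the repository name is assumed from the card title, and `harness_winogrande_5` is one of the configs listed in the metadata above):

```python
from datasets import load_dataset

# Repository name assumed from the card title; any config name from the metadata above works
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__zephyr-7b-alpha-dare-0.85",
    "harness_winogrande_5",
    split="train",
)
```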
[ "# Dataset Card for Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/zephyr-7b-alpha-dare-0.85 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:03:30.985884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/zephyr-7b-alpha-dare-0.85 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:03:30.985884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/zephyr-7b-alpha-dare-0.85## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/zephyr-7b-alpha-dare-0.85 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:03:30.985884(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
46fafac5826a6f695a1046ec75bb220120777ecb
This is a compilation of the following datasets:
- FredZhang7/stable-diffusion-prompts-2.47M
- laion/dalle-3-dataset
- Multimodal-Fatima/COCO_captions_train
- recastai/LAION-art-EN-improved-captions
- succinctly/midjourney-prompts
- Geonmo/midjourney-prompts-only
- daspartho/stable-diffusion-prompts

```python
# Filtering: keep prompts with more than 3 and fewer than 150 words,
# and with fewer than 10 uppercase characters
filtered = dataset.filter(
    lambda x: (len(x['prompt'].split()) > 3) and (len(x['prompt'].split()) < 150)
).filter(lambda x: (sum(1 for c in x['prompt'] if c.isupper()) < 10))
```

```python
# Dedup: keep only the first occurrence of each prompt
from functools import partial

memory = set()

def is_unique(elem, column: str, memory: set) -> bool:
    if elem[column] in memory:
        return False
    else:
        memory.add(elem[column])
        return True

dedup = processed.filter(partial(is_unique, column="prompt", memory=memory))
```
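Note that the in-memory `memory` set only works when `filter` runs in a single process (the default); with `num_proc` greater than 1, each worker would see its own copy of the set and duplicates could slip through. A minimal sketch for loading the compiled prompts from the Hub (the single `train` split and the `prompt`/`source` columns come from the metadata below):

```python
from datasets import load_dataset

# ~7.3M prompts; each row keeps the originating dataset in the "source" column
prompts = load_dataset("nreHieW/SD-Prompts", split="train")
print(prompts[0]["prompt"], "|", prompts[0]["source"])
```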
nreHieW/SD-Prompts
[ "license:apache-2.0", "region:us" ]
2023-12-04T16:09:42+00:00
{"license": "apache-2.0", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1203940244.4386747, "num_examples": 7331040}], "download_size": 559938627, "dataset_size": 1203940244.4386747}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-05T17:23:49+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
This is a compilation of the following datasets - FredZhang7/stable-diffusion-prompts-2.47M - laion/dalle-3-dataset - Multimodal-Fatima/COCO_captions_train - recastai/LAION-art-EN-improved-captions - succinctly/midjourney-prompts - Geonmo/midjourney-prompts-only - daspartho/stable-diffusion-prompts
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
a3c67f61c19c4799aa6522cb86dbdaaf7bae695f
# Dataset Card for Evaluation run of chinoll/Yi-6b-200k-dpo ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/chinoll/Yi-6b-200k-dpo - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [chinoll/Yi-6b-200k-dpo](https://huggingface.co/chinoll/Yi-6b-200k-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:10:17.402126](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo/blob/main/results_2023-12-04T16-10-17.402126.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6274780891690785, "acc_stderr": 0.03214198982171106, "acc_norm": 0.6382309545732996, "acc_norm_stderr": 0.03286487964348697, "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.4551491788416383, "mc2_stderr": 0.014826375266749701 }, "harness|arc:challenge|25": { "acc": 0.39505119453924914, "acc_stderr": 0.014285898292938172, "acc_norm": 0.4308873720136519, "acc_norm_stderr": 0.014471133392642475 }, "harness|hellaswag|10": { "acc": 0.5570603465445131, "acc_stderr": 0.004957182635381807, "acc_norm": 0.7452698665604461, "acc_norm_stderr": 0.004348189459336535 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137282, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137282 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.03196758697835362, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.03196758697835362 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47354497354497355, "acc_stderr": 0.02571523981134676, "acc_norm": 0.47354497354497355, "acc_norm_stderr": 0.02571523981134676 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361009, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361009 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.02503387058301518, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.02503387058301518 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, "acc_stderr": 0.027722065493361276, "acc_norm": 
0.7605042016806722, "acc_norm_stderr": 0.027722065493361276 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.0399552400768168, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.0399552400768168 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.01577623925616323, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.01577623925616323 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967409, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967409 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.0318114974705536, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794086, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794086 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.02126271940040697, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.02126271940040697 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8109833971902938, "acc_stderr": 0.014000791294407004, "acc_norm": 0.8109833971902938, "acc_norm_stderr": 0.014000791294407004 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3854748603351955, "acc_stderr": 0.016277927039638193, "acc_norm": 0.3854748603351955, "acc_norm_stderr": 0.016277927039638193 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667878, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667878 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7331189710610932, "acc_stderr": 0.025122637608816657, "acc_norm": 0.7331189710610932, "acc_norm_stderr": 0.025122637608816657 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48565840938722293, "acc_stderr": 0.012764981829524265, "acc_norm": 0.48565840938722293, "acc_norm_stderr": 0.012764981829524265 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6470588235294118, "acc_stderr": 0.02902942281568139, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.02902942281568139 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.01897542792050721, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.01897542792050721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.02752963744017493, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.02752963744017493 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.4551491788416383, "mc2_stderr": 0.014826375266749701 }, "harness|winogrande|5": { "acc": 0.7308602999210734, "acc_stderr": 0.012464911951268736 }, "harness|gsm8k|5": { "acc": 0.11372251705837756, "acc_stderr": 0.008744810131034042 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
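As a companion to the per-config snippet above, the aggregated metrics shown under "Latest results" can be pulled from the `results` configuration described in the summary (a sketch; the `latest` split name follows the convention these leaderboard datasets use for the most recent run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo",
    "results",
    split="latest",
)
print(results[0])
```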
open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo
[ "region:us" ]
2023-12-04T16:13:00+00:00
{"pretty_name": "Evaluation run of chinoll/Yi-6b-200k-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [chinoll/Yi-6b-200k-dpo](https://huggingface.co/chinoll/Yi-6b-200k-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:10:17.402126](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo/blob/main/results_2023-12-04T16-10-17.402126.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274780891690785,\n \"acc_stderr\": 0.03214198982171106,\n \"acc_norm\": 0.6382309545732996,\n \"acc_norm_stderr\": 0.03286487964348697,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39505119453924914,\n \"acc_stderr\": 0.014285898292938172,\n \"acc_norm\": 0.4308873720136519,\n \"acc_norm_stderr\": 0.014471133392642475\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5570603465445131,\n \"acc_stderr\": 0.004957182635381807,\n \"acc_norm\": 0.7452698665604461,\n \"acc_norm_stderr\": 0.004348189459336535\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n 
\"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835362,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835362\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361009,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361009\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n 
\"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967409,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407004,\n \"acc_norm\": 0.8109833971902938,\n 
\"acc_norm_stderr\": 0.014000791294407004\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n \"acc_stderr\": 0.012764981829524265,\n \"acc_norm\": 0.48565840938722293,\n \"acc_norm_stderr\": 0.012764981829524265\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02902942281568139,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02902942281568139\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268736\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \"acc_stderr\": 0.008744810131034042\n }\n}\n```", "repo_url": "https://huggingface.co/chinoll/Yi-6b-200k-dpo", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-17.402126.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-17.402126.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-17.402126.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-17.402126.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-17.402126.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-17.402126.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["**/details_harness|winogrande|5_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-10-17.402126.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_10_17.402126", "path": ["results_2023-12-04T16-10-17.402126.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-10-17.402126.parquet"]}]}]}
2023-12-04T16:13:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of chinoll/Yi-6b-200k-dpo

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model chinoll/Yi-6b-200k-dpo on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-12-04T16:10:17.402126 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
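The loading snippet referenced in the summary above is not preserved in this flattened card text. A minimal sketch of what it would look like, assuming the usual `details_<org>__<model>` repo naming used by these evaluation-run datasets (the repo id below is inferred from that pattern rather than taken from this record; the `harness_winogrande_5` config name does appear in this record's config list):

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention of these
# leaderboard detail datasets; harness_winogrande_5 is one of the listed configs.
data = load_dataset("open-llm-leaderboard/details_chinoll__Yi-6b-200k-dpo",
	"harness_winogrande_5",
	split="train")
```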
[ "# Dataset Card for Evaluation run of chinoll/Yi-6b-200k-dpo", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-6b-200k-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:17.402126(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of chinoll/Yi-6b-200k-dpo", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-6b-200k-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:17.402126(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chinoll/Yi-6b-200k-dpo## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-6b-200k-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:17.402126(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
613bb47aa41c13d5286cc50c6319242af7c6d1dd
# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/chinoll/Yi-7b-dpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [chinoll/Yi-7b-dpo](https://huggingface.co/chinoll/Yi-7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chinoll__Yi-7b-dpo",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-04T16:10:38.355372](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-7b-dpo/blob/main/results_2023-12-04T16-10-38.355372.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6274780891690785, "acc_stderr": 0.03214198982171106, "acc_norm": 0.6382309545732996, "acc_norm_stderr": 0.03286487964348697, "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.4551491788416383, "mc2_stderr": 0.014826375266749701 }, "harness|arc:challenge|25": { "acc": 0.39505119453924914, "acc_stderr": 0.014285898292938172, "acc_norm": 0.4308873720136519, "acc_norm_stderr": 0.014471133392642475 }, "harness|hellaswag|10": { "acc": 0.5570603465445131, "acc_stderr": 0.004957182635381807, "acc_norm": 0.7452698665604461, "acc_norm_stderr": 0.004348189459336535 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137282, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137282 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322666, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322666 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.04576665403207762, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.04576665403207762 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.03196758697835362, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.03196758697835362 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47354497354497355, "acc_stderr": 0.02571523981134676, "acc_norm": 0.47354497354497355, "acc_norm_stderr": 0.02571523981134676 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361009, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361009 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.02503387058301518, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.02503387058301518 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6487179487179487, "acc_stderr": 0.024203665177902803, "acc_norm": 0.6487179487179487, "acc_norm_stderr": 0.024203665177902803 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7605042016806722, "acc_stderr": 0.027722065493361276, "acc_norm": 
0.7605042016806722, "acc_norm_stderr": 0.027722065493361276 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.0399552400768168, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.0399552400768168 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8385321100917431, "acc_stderr": 0.01577623925616323, "acc_norm": 0.8385321100917431, "acc_norm_stderr": 0.01577623925616323 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5370370370370371, "acc_stderr": 0.03400603625538272, "acc_norm": 0.5370370370370371, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7794117647058824, "acc_stderr": 0.02910225438967409, "acc_norm": 0.7794117647058824, "acc_norm_stderr": 0.02910225438967409 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.0318114974705536, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794086, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794086 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.038260763248848646, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.038260763248848646 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.39285714285714285, "acc_stderr": 0.04635550135609976, "acc_norm": 0.39285714285714285, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.02126271940040697, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.02126271940040697 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8109833971902938, "acc_stderr": 0.014000791294407004, "acc_norm": 0.8109833971902938, "acc_norm_stderr": 0.014000791294407004 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3854748603351955, "acc_stderr": 0.016277927039638193, "acc_norm": 0.3854748603351955, "acc_norm_stderr": 0.016277927039638193 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667878, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667878 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7331189710610932, "acc_stderr": 0.025122637608816657, "acc_norm": 0.7331189710610932, "acc_norm_stderr": 0.025122637608816657 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.48565840938722293, "acc_stderr": 0.012764981829524265, "acc_norm": 0.48565840938722293, "acc_norm_stderr": 0.012764981829524265 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6470588235294118, "acc_stderr": 0.02902942281568139, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.02902942281568139 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.01897542792050721, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.01897542792050721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.02752963744017493, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.02752963744017493 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.035887028128263686, "acc_norm": 0.85, "acc_norm_stderr": 0.035887028128263686 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.3047735618115055, "mc1_stderr": 0.016114124156882455, "mc2": 0.4551491788416383, "mc2_stderr": 0.014826375266749701 }, "harness|winogrande|5": { "acc": 0.7308602999210734, "acc_stderr": 0.012464911951268736 }, "harness|gsm8k|5": { "acc": 0.11372251705837756, "acc_stderr": 0.008744810131034042 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
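Beyond the per-task configs, the aggregated scores shown under "Latest results" can also be pulled from the "results" configuration, whose "latest" split points at the most recent run. A minimal sketch, assuming that configuration loads the same way as the per-task configs in the summary above (the exact column layout is not documented in the card):

```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration listed for this repo;
# its "latest" split points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_chinoll__Yi-7b-dpo",
	"results",
	split="latest")

# Column layout is not documented in the card, so just inspect one row.
print(results.column_names)
print(results[0])
```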
open-llm-leaderboard/details_chinoll__Yi-7b-dpo
[ "region:us" ]
2023-12-04T16:13:23+00:00
{"pretty_name": "Evaluation run of chinoll/Yi-7b-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [chinoll/Yi-7b-dpo](https://huggingface.co/chinoll/Yi-7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chinoll__Yi-7b-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:10:38.355372](https://huggingface.co/datasets/open-llm-leaderboard/details_chinoll__Yi-7b-dpo/blob/main/results_2023-12-04T16-10-38.355372.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274780891690785,\n \"acc_stderr\": 0.03214198982171106,\n \"acc_norm\": 0.6382309545732996,\n \"acc_norm_stderr\": 0.03286487964348697,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39505119453924914,\n \"acc_stderr\": 0.014285898292938172,\n \"acc_norm\": 0.4308873720136519,\n \"acc_norm_stderr\": 0.014471133392642475\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5570603465445131,\n \"acc_stderr\": 0.004957182635381807,\n \"acc_norm\": 0.7452698665604461,\n \"acc_norm_stderr\": 0.004348189459336535\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835362,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835362\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361009,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361009\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n 
\"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967409,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407004,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407004\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n \"acc_stderr\": 0.012764981829524265,\n \"acc_norm\": 0.48565840938722293,\n \"acc_norm_stderr\": 0.012764981829524265\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02902942281568139,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02902942281568139\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4551491788416383,\n \"mc2_stderr\": 0.014826375266749701\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268736\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \"acc_stderr\": 0.008744810131034042\n }\n}\n```", "repo_url": "https://huggingface.co/chinoll/Yi-7b-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-10-38.355372.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["**/details_harness|winogrande|5_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-10-38.355372.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_10_38.355372", "path": ["results_2023-12-04T16-10-38.355372.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-10-38.355372.parquet"]}]}]}
2023-12-04T16:14:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model chinoll/Yi-7b-dpo on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:10:38.355372 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:38.355372(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:38.355372(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chinoll/Yi-7b-dpo## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chinoll/Yi-7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:10:38.355372(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
779a05ed79f60cfe743e963fd0a26ae2832e4062
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v1-fp16](https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:23:37.533105](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16/blob/main/results_2023-12-04T16-23-37.533105.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6356711503628629, "acc_stderr": 0.03264369072727708, "acc_norm": 0.6379873148773121, "acc_norm_stderr": 0.03330588124087063, "mc1": 0.47368421052631576, "mc1_stderr": 0.017479241161975526, "mc2": 0.632786784829325, "mc2_stderr": 0.015062396850296454 }, "harness|arc:challenge|25": { "acc": 0.6484641638225256, "acc_stderr": 0.01395241369960094, "acc_norm": 0.6843003412969283, "acc_norm_stderr": 0.013582571095815291 }, "harness|hellaswag|10": { "acc": 0.6598287193786099, "acc_stderr": 0.0047279834341954945, "acc_norm": 0.8542123083051185, "acc_norm_stderr": 0.0035217202839105555 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.04408440022768079, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595853, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595853 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.028727502957880267, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.028727502957880267 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 
0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6042553191489362, "acc_stderr": 0.03196758697835363, "acc_norm": 0.6042553191489362, "acc_norm_stderr": 0.03196758697835363 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3888888888888889, "acc_stderr": 0.025107425481137282, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.025107425481137282 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782655, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782655 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198896, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198896 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 
0.6932773109243697, "acc_stderr": 0.029953823891887037, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887037 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092437, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092437 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5462962962962963, "acc_stderr": 0.033953227263757976, "acc_norm": 0.5462962962962963, "acc_norm_stderr": 0.033953227263757976 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588667, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588667 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601457, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.02336505149175372, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.02336505149175372 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8199233716475096, "acc_stderr": 0.013740797258579832, "acc_norm": 0.8199233716475096, "acc_norm_stderr": 0.013740797258579832 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.024547617794803828, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.024547617794803828 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38212290502793295, "acc_stderr": 0.016251139711570776, "acc_norm": 0.38212290502793295, "acc_norm_stderr": 0.016251139711570776 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.02617390850671858, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.02617390850671858 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881876, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 
0.02592237178881876 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.024748624490537365, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.024748624490537365 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4426336375488918, "acc_stderr": 0.012685906538206244, "acc_norm": 0.4426336375488918, "acc_norm_stderr": 0.012685906538206244 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983576, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983576 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223974, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223974 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169146, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169146 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.47368421052631576, "mc1_stderr": 0.017479241161975526, "mc2": 0.632786784829325, "mc2_stderr": 0.015062396850296454 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.01094187795567621 }, "harness|gsm8k|5": { "acc": 0.5511751326762699, "acc_stderr": 0.013700157442788071 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16
[ "region:us" ]
2023-12-04T16:26:28+00:00
{"pretty_name": "Evaluation run of fblgit/una-cybertron-7b-v1-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v1-fp16](https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:23:37.533105](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v1-fp16/blob/main/results_2023-12-04T16-23-37.533105.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6356711503628629,\n \"acc_stderr\": 0.03264369072727708,\n \"acc_norm\": 0.6379873148773121,\n \"acc_norm_stderr\": 0.03330588124087063,\n \"mc1\": 0.47368421052631576,\n \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.632786784829325,\n \"mc2_stderr\": 0.015062396850296454\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.01395241369960094,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6598287193786099,\n \"acc_stderr\": 0.0047279834341954945,\n \"acc_norm\": 0.8542123083051185,\n \"acc_norm_stderr\": 0.0035217202839105555\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601457,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601457\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570776,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570776\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881876,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881876\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537365,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223974,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223974\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.632786784829325,\n \"mc2_stderr\": 0.015062396850296454\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5511751326762699,\n \"acc_stderr\": 
0.013700157442788071\n }\n}\n```", "repo_url": "https://huggingface.co/fblgit/una-cybertron-7b-v1-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_23_37.533105", "path": ["**/details_harness|winogrande|5_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-23-37.533105.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T16_23_37.533105", "path": ["results_2023-12-04T16-23-37.533105.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T16-23-37.533105.parquet"]}]}]}
2023-12-04T16:27:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v1-fp16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:23:37.533105 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:23:37.533105(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:23:37.533105(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 26, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v1-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:23:37.533105(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cf887506b4a656fa2dc19e5e82110eb43d50570d
# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [xxyyy123/Mistral7B_adaptor_v1](https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:24:21.549046](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1/blob/main/results_2023-12-04T16-24-21.549046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6337577170628004, "acc_stderr": 0.0323628770141504, "acc_norm": 0.6389509106050781, "acc_norm_stderr": 0.033013052940440775, "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756544, "mc2": 0.4976810372450733, "mc2_stderr": 0.01504045830849688 }, "harness|arc:challenge|25": { "acc": 0.5895904436860068, "acc_stderr": 0.014374922192642664, "acc_norm": 0.6296928327645052, "acc_norm_stderr": 0.01411129875167495 }, "harness|hellaswag|10": { "acc": 0.6352320254929297, "acc_stderr": 0.004803812631994955, "acc_norm": 0.8380800637323242, "acc_norm_stderr": 0.0036762448867232646 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.0446196043338474, "acc_norm": 0.27, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6513157894736842, "acc_stderr": 0.0387813988879761, "acc_norm": 0.6513157894736842, "acc_norm_stderr": 0.0387813988879761 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 
0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6416184971098265, "acc_stderr": 0.03656343653353159, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.03656343653353159 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082637, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082637 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.02519710107424649, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.02519710107424649 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7580645161290323, "acc_stderr": 0.024362599693031096, "acc_norm": 0.7580645161290323, "acc_norm_stderr": 0.024362599693031096 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5369458128078818, "acc_stderr": 0.035083705204426656, "acc_norm": 0.5369458128078818, "acc_norm_stderr": 0.035083705204426656 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.02886977846026705, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.02886977846026705 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121427, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121427 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6538461538461539, "acc_stderr": 0.024121125416941197, "acc_norm": 0.6538461538461539, "acc_norm_stderr": 0.024121125416941197 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 
0.03017680828897434, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.03017680828897434 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3973509933774834, "acc_stderr": 0.039955240076816806, "acc_norm": 0.3973509933774834, "acc_norm_stderr": 0.039955240076816806 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8311926605504587, "acc_stderr": 0.01606005626853036, "acc_norm": 0.8311926605504587, "acc_norm_stderr": 0.01606005626853036 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.02759917430064076, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.02759917430064076 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7637130801687764, "acc_stderr": 0.027652153144159256, "acc_norm": 0.7637130801687764, "acc_norm_stderr": 0.027652153144159256 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.0364129708131373, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7975460122699386, "acc_stderr": 0.03157065078911901, "acc_norm": 0.7975460122699386, "acc_norm_stderr": 0.03157065078911901 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597524, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597524 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8199233716475096, "acc_stderr": 0.013740797258579828, "acc_norm": 0.8199233716475096, "acc_norm_stderr": 0.013740797258579828 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.024257901705323378, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.024257901705323378 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31843575418994413, "acc_stderr": 0.015581008080360276, "acc_norm": 0.31843575418994413, "acc_norm_stderr": 0.015581008080360276 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.761437908496732, "acc_stderr": 0.024404394928087873, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.024404394928087873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7253086419753086, "acc_stderr": 0.02483605786829468, "acc_norm": 0.7253086419753086, "acc_norm_stderr": 0.02483605786829468 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44328552803129073, "acc_stderr": 0.01268781841959992, "acc_norm": 0.44328552803129073, "acc_norm_stderr": 0.01268781841959992 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406752, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406752 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.01897542792050721, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.01897542792050721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291296, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482707, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482707 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.03061111655743253, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.03061111655743253 }, "harness|truthfulqa:mc|0": { "mc1": 0.3525091799265606, "mc1_stderr": 0.016724646380756544, "mc2": 0.4976810372450733, "mc2_stderr": 0.01504045830849688 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987729 }, "harness|gsm8k|5": { "acc": 0.41243366186504926, "acc_stderr": 0.013559628790941452 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
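In the same spirit as the per-task loading example given earlier in this card, the aggregated metrics for this run can be read from the "results" configuration. The snippet below is a minimal sketch, assuming only that the `datasets` library is installed; it uses the "latest" split, which, as described above, always points to the most recent evaluation run.

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" configuration; the "latest" split
# always points at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1",
    "results",
    split="latest",
)

# The split typically holds a single row of aggregated scores.
print(results[0])
```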
open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1
[ "region:us" ]
2023-12-04T16:27:12+00:00
{"pretty_name": "Evaluation run of xxyyy123/Mistral7B_adaptor_v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/Mistral7B_adaptor_v1](https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:24:21.549046](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1/blob/main/results_2023-12-04T16-24-21.549046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6337577170628004,\n \"acc_stderr\": 0.0323628770141504,\n \"acc_norm\": 0.6389509106050781,\n \"acc_norm_stderr\": 0.033013052940440775,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.4976810372450733,\n \"mc2_stderr\": 0.01504045830849688\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n \"acc_stderr\": 0.004803812631994955,\n \"acc_norm\": 0.8380800637323242,\n \"acc_norm_stderr\": 0.0036762448867232646\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 
0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853036,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853036\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n 
\"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087873,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.02483605786829468,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.02483605786829468\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.4976810372450733,\n \"mc2_stderr\": 0.01504045830849688\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41243366186504926,\n \"acc_stderr\": 0.013559628790941452\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["**/details_harness|winogrande|5_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-24-21.549046.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_24_21.549046", "path": ["results_2023-12-04T16-24-21.549046.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-24-21.549046.parquet"]}]}]}
2023-12-04T16:27:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model xxyyy123/Mistral7B_adaptor_v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:24:21.549046 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral7B_adaptor_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:24:21.549046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral7B_adaptor_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:24:21.549046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/Mistral7B_adaptor_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:24:21.549046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
e548aeb66f3cc9a1fd7cc121800fdcf3af33c7d7
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v2-bf16](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:28:35.097444](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16/blob/main/results_2023-12-04T16-28-35.097444.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6349296405755961, "acc_stderr": 0.03261211472247009, "acc_norm": 0.6370258261406261, "acc_norm_stderr": 0.03327308531523366, "mc1": 0.48714810281517745, "mc1_stderr": 0.017497717944299825, "mc2": 0.646322826116642, "mc2_stderr": 0.015041829082644448 }, "harness|arc:challenge|25": { "acc": 0.6552901023890785, "acc_stderr": 0.01388881628678211, "acc_norm": 0.6825938566552902, "acc_norm_stderr": 0.013602239088038167 }, "harness|hellaswag|10": { "acc": 0.6717785301732723, "acc_stderr": 0.004686062421158145, "acc_norm": 0.8584943238398726, "acc_norm_stderr": 0.0034783009945146925 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.04292346959909283, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595853, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595853 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.037455547914624555, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.037455547914624555 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5914893617021276, "acc_stderr": 0.032134180267015755, "acc_norm": 0.5914893617021276, "acc_norm_stderr": 0.032134180267015755 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.02525303255499769, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.02525303255499769 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7645161290322581, "acc_stderr": 0.024137632429337714, "acc_norm": 0.7645161290322581, "acc_norm_stderr": 0.024137632429337714 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.02840653309060846, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.02840653309060846 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, 
"acc_stderr": 0.029953823891887037, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887037 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.01584825580650155, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.01584825580650155 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676177, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676177 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516302, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516302 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.034878251684978906, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.034878251684978906 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822584, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822584 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841403, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841403 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8160919540229885, "acc_stderr": 0.013853724170922534, "acc_norm": 0.8160919540229885, "acc_norm_stderr": 0.013853724170922534 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6994219653179191, "acc_stderr": 0.024685316867257803, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.024685316867257803 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3787709497206704, "acc_stderr": 0.01622353351036511, "acc_norm": 0.3787709497206704, "acc_norm_stderr": 0.01622353351036511 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.02671611838015685, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.02671611838015685 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.7222222222222222, "acc_stderr": 0.02492200116888633, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.02492200116888633 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4452411994784876, "acc_stderr": 0.012693421303973294, "acc_norm": 0.4452411994784876, "acc_norm_stderr": 0.012693421303973294 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6544117647058824, "acc_stderr": 0.028888193103988633, "acc_norm": 0.6544117647058824, "acc_norm_stderr": 0.028888193103988633 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6503267973856209, "acc_stderr": 0.01929196189506638, "acc_norm": 0.6503267973856209, "acc_norm_stderr": 0.01929196189506638 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.029279567411065677, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.029279567411065677 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578327, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7953216374269005, "acc_stderr": 0.03094445977853321, "acc_norm": 0.7953216374269005, "acc_norm_stderr": 0.03094445977853321 }, "harness|truthfulqa:mc|0": { "mc1": 0.48714810281517745, "mc1_stderr": 0.017497717944299825, "mc2": 0.646322826116642, "mc2_stderr": 0.015041829082644448 }, "harness|winogrande|5": { "acc": 0.8097868981846882, "acc_stderr": 0.01103033579861744 }, "harness|gsm8k|5": { "acc": 0.5504169825625473, "acc_stderr": 0.013702290047884747 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16
[ "region:us" ]
2023-12-04T16:31:29+00:00
{"pretty_name": "Evaluation run of fblgit/una-cybertron-7b-v2-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v2-bf16](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:28:35.097444](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v2-bf16/blob/main/results_2023-12-04T16-28-35.097444.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6349296405755961,\n \"acc_stderr\": 0.03261211472247009,\n \"acc_norm\": 0.6370258261406261,\n \"acc_norm_stderr\": 0.03327308531523366,\n \"mc1\": 0.48714810281517745,\n \"mc1_stderr\": 0.017497717944299825,\n \"mc2\": 0.646322826116642,\n \"mc2_stderr\": 0.015041829082644448\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6717785301732723,\n \"acc_stderr\": 0.004686062421158145,\n \"acc_norm\": 0.8584943238398726,\n \"acc_norm_stderr\": 0.0034783009945146925\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.024137632429337714,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.024137632429337714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8160919540229885,\n \"acc_stderr\": 0.013853724170922534,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922534\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n \"acc_stderr\": 0.01622353351036511,\n \"acc_norm\": 0.3787709497206704,\n \"acc_norm_stderr\": 0.01622353351036511\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48714810281517745,\n \"mc1_stderr\": 0.017497717944299825,\n \"mc2\": 0.646322826116642,\n \"mc2_stderr\": 0.015041829082644448\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5504169825625473,\n \"acc_stderr\": 0.013702290047884747\n 
}\n}\n```", "repo_url": "https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_28_35.097444", "path": ["**/details_harness|winogrande|5_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-28-35.097444.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T16_28_35.097444", "path": ["results_2023-12-04T16-28-35.097444.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T16-28-35.097444.parquet"]}]}]}
2023-12-04T16:32:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v2-bf16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:28:35.097444 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v2-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:28:35.097444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v2-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:28:35.097444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v2-bf16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model fblgit/una-cybertron-7b-v2-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:28:35.097444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f1a5849e809f436ffbd8ecc33d8279318d1a74eb
# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:32:38.493956](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3/blob/main/results_2023-12-04T16-32-38.493956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.613382583651225, "acc_stderr": 0.03275222858972369, "acc_norm": 0.6188052812995584, "acc_norm_stderr": 0.03341845889809012, "mc1": 0.3219094247246022, "mc1_stderr": 0.016355567611960397, "mc2": 0.48209769794385393, "mc2_stderr": 0.014975113215989893 }, "harness|arc:challenge|25": { "acc": 0.5665529010238908, "acc_stderr": 0.014481376224558902, "acc_norm": 0.60580204778157, "acc_norm_stderr": 0.01428052266746732 }, "harness|hellaswag|10": { "acc": 0.6404102768372834, "acc_stderr": 0.004788994060654277, "acc_norm": 0.8333997211710814, "acc_norm_stderr": 0.003718570792719566 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.042320736951515885, "acc_norm": 0.6, "acc_norm_stderr": 0.042320736951515885 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6644736842105263, "acc_stderr": 0.038424985593952694, "acc_norm": 0.6644736842105263, "acc_norm_stderr": 0.038424985593952694 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6452830188679245, "acc_stderr": 0.02944517532819959, "acc_norm": 0.6452830188679245, "acc_norm_stderr": 0.02944517532819959 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.03724249595817731, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.03724249595817731 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077636, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.03265019475033582, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42063492063492064, "acc_stderr": 0.025424835086924, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.025424835086924 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.043758884927270605, "acc_norm": 0.3968253968253968, 
"acc_norm_stderr": 0.043758884927270605 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7193548387096774, "acc_stderr": 0.025560604721022884, "acc_norm": 0.7193548387096774, "acc_norm_stderr": 0.025560604721022884 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479049, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479049 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8652849740932642, "acc_stderr": 0.02463978909770944, "acc_norm": 0.8652849740932642, "acc_norm_stderr": 0.02463978909770944 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5923076923076923, "acc_stderr": 0.02491524398598785, "acc_norm": 0.5923076923076923, "acc_norm_stderr": 0.02491524398598785 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608452, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608452 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6134453781512605, "acc_stderr": 0.03163145807552378, "acc_norm": 0.6134453781512605, "acc_norm_stderr": 0.03163145807552378 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.40397350993377484, "acc_stderr": 0.040064856853653415, "acc_norm": 0.40397350993377484, "acc_norm_stderr": 0.040064856853653415 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.016659279700295845, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.016659279700295845 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640763, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640763 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6591928251121076, "acc_stderr": 0.0318114974705536, "acc_norm": 0.6591928251121076, "acc_norm_stderr": 0.0318114974705536 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 
}, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6625766871165644, "acc_stderr": 0.03714908409935573, "acc_norm": 0.6625766871165644, "acc_norm_stderr": 0.03714908409935573 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8504273504273504, "acc_stderr": 0.023365051491753715, "acc_norm": 0.8504273504273504, "acc_norm_stderr": 0.023365051491753715 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8071519795657727, "acc_stderr": 0.014108533515757431, "acc_norm": 0.8071519795657727, "acc_norm_stderr": 0.014108533515757431 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.661849710982659, "acc_stderr": 0.025469770149400175, "acc_norm": 0.661849710982659, "acc_norm_stderr": 0.025469770149400175 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3016759776536313, "acc_stderr": 0.015350767572220286, "acc_norm": 0.3016759776536313, "acc_norm_stderr": 0.015350767572220286 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6975308641975309, "acc_stderr": 0.025557653981868052, "acc_norm": 0.6975308641975309, "acc_norm_stderr": 0.025557653981868052 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45436766623207303, "acc_stderr": 0.012716941720734802, "acc_norm": 0.45436766623207303, "acc_norm_stderr": 0.012716941720734802 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6360294117647058, "acc_stderr": 0.029227192460032025, "acc_norm": 0.6360294117647058, "acc_norm_stderr": 0.029227192460032025 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.019270998708223977, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.019270998708223977 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8109452736318408, "acc_stderr": 0.027686913588013024, "acc_norm": 0.8109452736318408, "acc_norm_stderr": 0.027686913588013024 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 
0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.3219094247246022, "mc1_stderr": 0.016355567611960397, "mc2": 0.48209769794385393, "mc2_stderr": 0.014975113215989893 }, "harness|winogrande|5": { "acc": 0.7774269928966061, "acc_stderr": 0.011690933809712669 }, "harness|gsm8k|5": { "acc": 0.3616376042456406, "acc_stderr": 0.013234658351088774 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
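### Loading the aggregated results (example)

As a complement to the `load_dataset` example in the summary above, the aggregated metrics can also be read directly from the "results" configuration. The sketch below is illustrative only: it assumes the `datasets` library is installed and that the "results" configuration exposes a "latest" split, mirroring the split layout listed for the per-task configurations in this card's metadata; the exact column layout of the returned rows is not specified here, so the snippet simply inspects the first record.

```python
from datasets import load_dataset

# Illustrative sketch (assumption: the "results" config has a "latest" split,
# like the per-task configs listed in this card's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3",
    "results",
    split="latest",
)

# Inspect the schema and the first row of aggregated metrics.
print(results.features)
print(results[0])
```

If the split name differs, the available splits for a configuration can be checked first with `datasets.get_dataset_split_names("open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "results")`.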
open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3
[ "region:us" ]
2023-12-04T16:35:27+00:00
{"pretty_name": "Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:32:38.493956](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3/blob/main/results_2023-12-04T16-32-38.493956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.613382583651225,\n \"acc_stderr\": 0.03275222858972369,\n \"acc_norm\": 0.6188052812995584,\n \"acc_norm_stderr\": 0.03341845889809012,\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48209769794385393,\n \"mc2_stderr\": 0.014975113215989893\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.01428052266746732\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6404102768372834,\n \"acc_stderr\": 0.004788994060654277,\n \"acc_norm\": 0.8333997211710814,\n \"acc_norm_stderr\": 0.003718570792719566\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022884,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 
0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.02491524398598785,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608452,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608452\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295845,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295845\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400175,\n \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400175\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n \"acc_stderr\": 0.012716941720734802,\n \"acc_norm\": 0.45436766623207303,\n \"acc_norm_stderr\": 0.012716941720734802\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960397,\n \"mc2\": 0.48209769794385393,\n \"mc2_stderr\": 0.014975113215989893\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712669\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3616376042456406,\n \"acc_stderr\": 0.013234658351088774\n }\n}\n```", "repo_url": "https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-32-38.493956.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["**/details_harness|winogrande|5_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-04T16-32-38.493956.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_32_38.493956", "path": ["results_2023-12-04T16-32-38.493956.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T16-32-38.493956.parquet"]}]}]}
2023-12-04T16:36:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:32:38.493956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
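The summary above says "To load the details from a run, you can for instance do the following:" but the code snippet was stripped in this flattened rendering. A hedged reconstruction is sketched below; the repository id is inferred from the leaderboard's usual `details_<org>__<model>` naming convention, and the config and split names are taken from the configuration metadata of this record, so treat them as assumptions to verify.

```python
from datasets import load_dataset

# Assumed repository id, following the "open-llm-leaderboard/details_<org>__<model>"
# naming convention; the config name and the "latest" split come from the
# data_files metadata listed for this record.
data = load_dataset(
    "open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3",
    "harness_winogrande_5",
    split="latest",
)
```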
[ "# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:32:38.493956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:32:38.493956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 37, 31, 186, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:32:38.493956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
84e2d6c1b999fe7dcbe394b838fabf7e0d77952f
Downloaded Nov 25th 2023 from ZINC-20 (https://zinc20.docking.org/). Contains the unique canonicalized SMILES molecules; canonicalization was done with RDKit.

The following filtering criteria were used in the ZINC-20 tranches:

- 2D representation
- Standard reaction
- Annotated purchasability
- Drug-like subset

The distribution of tokenized SMILES sequence lengths is shown below, along with the top 15 SMILES tokens and their frequencies. The following regex string was used to split a SMILES molecule into tokens:

(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])

![Distribution of Sequence Lengths](sequence_lengths_distribution.png)

![Top 15 SMILES Tokens and Their Frequencies](top_smiles_tokens.png)
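As an illustration of the preprocessing described above, here is a minimal sketch that canonicalizes a SMILES string with RDKit and splits it into tokens with the regex from this card. It assumes RDKit is installed (e.g. `pip install rdkit`); the helper names are ours and not part of the dataset.

```python
import re

from rdkit import Chem

# Regex from the card, kept verbatim (the doubled backslash matches a literal "\").
SMILES_TOKEN_PATTERN = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\."
    r"|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])"
)


def canonicalize(smiles):
    """Return the RDKit canonical SMILES, or None if the string cannot be parsed."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol, canonical=True) if mol is not None else None


def tokenize(smiles):
    """Split a SMILES string into tokens using the regex above."""
    return SMILES_TOKEN_PATTERN.findall(smiles)


canonical = canonicalize("C1=CC=CC=C1O")  # phenol, e.g. "Oc1ccccc1"
tokens = tokenize(canonical)              # e.g. ['O', 'c', '1', 'c', 'c', 'c', 'c', 'c', '1']
```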
Pixelatory/ZINC20-Druglike
[ "size_categories:1B<n<10B", "chemistry", "region:us" ]
2023-12-04T16:37:33+00:00
{"size_categories": ["1B<n<10B"], "tags": ["chemistry"], "configs": [{"config_name": "default", "data_files": "zinc-druglike-cano.tar.xz"}]}
2023-12-06T04:00:57+00:00
[]
[]
TAGS #size_categories-1B<n<10B #chemistry #region-us
Downloaded Nov 25th 2023 from ZINC-20 (URL Contains the unique canonicalized SMILES molecules, done with RDKit. Used this filtering criteria in the ZINC-20 tranches: - 2D representation - Standard reaction - Annotated purchasability - Drug-like subset Distribution of tokenized SMILES sequence lengths below, along with the top 15 SMILES tokens (with their frequencies). The following regex string was used to split the SMILES molecule into tokens: (\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9]) !Distribution of Sequence Lengths !Top 15 SMILES Tokens and Their Frequencies
[]
[ "TAGS\n#size_categories-1B<n<10B #chemistry #region-us \n" ]
[ 22 ]
[ "passage: TAGS\n#size_categories-1B<n<10B #chemistry #region-us \n" ]
da10aae44140ac618c9cfcd96572555f6797a4d1
# Dataset Card

This dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for every film in the original dataset that has that information complete. It is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.

This dataset has a smaller version [here](https://huggingface.co/datasets/mt0rm0/movie_descriptors_small).

## Dataset Details

### Dataset Description

The dataset has 44435 rows and 3 columns:

- 'name': includes the title of the movies
- 'release_year': indicates the year of release
- 'overview': provides a brief description of each movie, used for advertisement.

**Curated by:** [Mario Tormo Romero](https://huggingface.co/mt0rm0)

**Language(s) (NLP):** English

**License:** cc0-1.0

### Dataset Sources

This dataset is a subset of Kaggle's [The Movie Dataset](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset). We have only used the <kbd>movies_metadata.csv</kbd> file, extracted some features (see Dataset Description) and dropped the rows that were not complete.

The original dataset has a cc0-1.0 license and we have maintained it.

## Uses

This is a toy dataset created for pedagogical purposes, and is used in the **Working with embeddings** workshop created and organized by the [AI Service Center Berlin-Brandenburg](https://hpi.de/kisz/) at the [Hasso Plattner Institute](https://hpi.de/).

## Dataset Creation

### Curation Rationale

With this dataset we want to provide a fast way of obtaining the required data for our workshops, without having to download huge datasets that contain much more information than we need.

### Source Data

Our source is Kaggle's The Movie Dataset, so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.

#### Data Collection and Processing

The data was downloaded from [Kaggle](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset) as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.

The data was processed with the following code:

```python
import pandas as pd

# load the csv file
df = pd.read_csv("movies_metadata.csv", low_memory=False)

# select the required columns, drop rows with missing values and
# reset the index
df = df.loc[:, ['title', 'release_date', 'overview']]
df = df.dropna(axis=0).reset_index(drop=True)

# make a new column with the release year
df.loc[:, 'release_year'] = pd.to_datetime(df.release_date).dt.year

# select the columns in the desired order
df = df.loc[:, ['title', 'release_year', 'overview']]

# save the data to parquet
df.to_parquet('descriptors_data.parquet')
```

#### Who are the source data producers?

This dataset is an ensemble of data collected by [Rounak Banik](https://www.kaggle.com/rounakbanik) from TMDB and GroupLens. In particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb.
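Since the card targets an embeddings workshop, a short usage sketch may be helpful. It assumes the `datasets` and `sentence-transformers` packages are installed; the split name and the embedding model are our own illustrative choices, not something prescribed by this dataset.

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer

# Assumes the data is exposed as a single "train" split on the Hub.
ds = load_dataset("mt0rm0/movie_descriptors", split="train")

# Any sentence-embedding model works; this small, widely used one is just an example.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(ds["overview"][:100], show_progress_bar=True)

print(embeddings.shape)  # (100, 384) for this particular model
```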
mt0rm0/movie_descriptors
[ "task_categories:sentence-similarity", "size_categories:10K<n<100K", "language:en", "license:cc0-1.0", "movies", "embeddings", "semantic search", "films", "hpi", "workshop", "region:us" ]
2023-12-04T16:44:25+00:00
{"language": ["en"], "license": "cc0-1.0", "size_categories": ["10K<n<100K"], "task_categories": ["sentence-similarity"], "pretty_name": "\"Movie descriptors for Semantic Search\"", "tags": ["movies", "embeddings", "semantic search", "films", "hpi", "workshop"]}
2023-12-04T16:46:54+00:00
[]
[ "en" ]
TAGS #task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us
# Dataset Card This dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for every film in the original dataset that has that information complete. It is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute. This dataset has a smaller version here. ## Dataset Details ### Dataset Description The dataset has 44435 rows and 3 columns: - 'name': includes the title of the movies - 'release_year': indicates the year of release - 'overview': provides a brief description of each movie, used for advertisement. Curated by: Mario Tormo Romero Language(s) (NLP): English License: cc0-1.0 ### Dataset Sources This Dataset is a subset of Kaggle's The Movie Dataset. We have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete. The original Dataset has a cc0-1.0 License and we have maintained it. ## Uses This is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute. ## Dataset Creation ### Curation Rationale We want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets that contain just way too much information. ### Source Data Our source is Kaggle's The Movie Dataset., so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017. #### Data Collection and Processing The data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted. The data was processed with the following code: #### Who are the source data producers? This dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens. In particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb.
[ "# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for every film in the original dataset that has that information complete.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a smaller version here.", "## Dataset Details", "### Dataset Description\n\nThe dataset has 44435 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nCurated by: Mario Tormo Romero\n\nLanguage(s) (NLP): English\n\nLicense: cc0-1.0", "### Dataset Sources\n\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.", "## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.", "## Dataset Creation", "### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets that contain just way too much information.", "### Source Data\n\nOur source is Kaggle's The Movie Dataset., so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.", "#### Data Collection and Processing\n\nThe data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.\n\nThe data was processed with the following code:", "#### Who are the source data producers?\n\nThis dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens.\nIn particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb." ]
[ "TAGS\n#task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us \n", "# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for every film in the original dataset that has that information complete.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a smaller version here.", "## Dataset Details", "### Dataset Description\n\nThe dataset has 44435 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nCurated by: Mario Tormo Romero\n\nLanguage(s) (NLP): English\n\nLicense: cc0-1.0", "### Dataset Sources\n\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.", "## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.", "## Dataset Creation", "### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets that contain just way too much information.", "### Source Data\n\nOur source is Kaggle's The Movie Dataset., so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017.", "#### Data Collection and Processing\n\nThe data was downloaded from Kaggle as a zip file. The file <kbd>movies_metadata.csv</kbd> was then extracted.\n\nThe data was processed with the following code:", "#### Who are the source data producers?\n\nThis dataset is an ensemble of data collected by Rounak Banik from TMDB and GroupLens.\nIn particular, the movies metadata has been collected from the TMDB Open API, but the source dataset is not endorsed or certified by TMDb." ]
[ 63, 90, 4, 87, 89, 54, 5, 43, 42, 53, 69 ]
[ "passage: TAGS\n#task_categories-sentence-similarity #size_categories-10K<n<100K #language-English #license-cc0-1.0 #movies #embeddings #semantic search #films #hpi #workshop #region-us \n# Dataset Card\n\nThis dataset is a subset from Kaggle's The Movie Dataset that contains only name, release year and overview for every film in the original dataset that has that information complete.\nIt is intended as a toy dataset for learning about embeddings in a workshop from the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.\n\nThis dataset has a smaller version here.## Dataset Details### Dataset Description\n\nThe dataset has 44435 rows and 3 columns:\n\n- 'name': includes the title of the movies\n- 'release_year': indicates the year of release\n- 'overview': provides a brief description of each movie, used for advertisement.\n\nCurated by: Mario Tormo Romero\n\nLanguage(s) (NLP): English\n\nLicense: cc0-1.0### Dataset Sources\n\nThis Dataset is a subset of Kaggle's The Movie Dataset. \nWe have only used the <kbd>movies_metadata.csv</kbd> file and extracted some features (see Dataset Description) and dropped the rows that didn't were complete.\n\nThe original Dataset has a cc0-1.0 License and we have maintained it.## Uses\n\nThis is a toy dataset created for pegagogical purposes, and is used in the Working with embeddings Workshop created and organized by the AI Service Center Berlin-Brandenburg at the Hasso Plattner Institute.## Dataset Creation### Curation Rationale\n\nWe want to provide with this dataset a fast way of obtaining the required data for our workshops without having to download huge datasets that contain just way too much information.### Source Data\n\nOur source is Kaggle's The Movie Dataset., so the information comes from the MovieLens Dataset. The dataset consists of movies released on or before July 2017." ]
53d0512f248d1e0a4791d29140d7140c978b0197
# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/uukuguy/Orca-2-13b-f16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [uukuguy/Orca-2-13b-f16](https://huggingface.co/uukuguy/Orca-2-13b-f16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:43:12.398370](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16/blob/main/results_2023-12-04T16-43-12.398370.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6024963144778468, "acc_stderr": 0.03292700891541927, "acc_norm": 0.6070525664063983, "acc_norm_stderr": 0.03359636787928049, "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5641081747684346, "mc2_stderr": 0.015927666604862285 }, "harness|arc:challenge|25": { "acc": 0.5733788395904437, "acc_stderr": 0.014453185592920293, "acc_norm": 0.606655290102389, "acc_norm_stderr": 0.014275101465693024 }, "harness|hellaswag|10": { "acc": 0.6115315674168492, "acc_stderr": 0.004864058877626273, "acc_norm": 0.7981477793268273, "acc_norm_stderr": 0.004005621755121483 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.743421052631579, "acc_stderr": 0.03554180368025689, "acc_norm": 0.743421052631579, "acc_norm_stderr": 0.03554180368025689 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6264150943396226, "acc_stderr": 0.029773082713319875, "acc_norm": 0.6264150943396226, "acc_norm_stderr": 0.029773082713319875 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283648, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283648 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.032436186361081004, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.032436186361081004 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.02494236893115979, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.02494236893115979 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7387096774193549, "acc_stderr": 0.02499305339776481, "acc_norm": 0.7387096774193549, "acc_norm_stderr": 0.02499305339776481 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03477691162163659, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7373737373737373, "acc_stderr": 0.03135305009533086, "acc_norm": 0.7373737373737373, "acc_norm_stderr": 0.03135305009533086 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5974358974358974, "acc_stderr": 0.024864995159767762, "acc_norm": 0.5974358974358974, "acc_norm_stderr": 0.024864995159767762 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683515, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683515 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.031282177063684614, "acc_norm": 
0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200144, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200144 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8143459915611815, "acc_stderr": 0.025310495376944856, "acc_norm": 0.8143459915611815, "acc_norm_stderr": 0.025310495376944856 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.031602951437766785, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384493, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384493 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.022509033937077802, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.022509033937077802 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.776500638569604, "acc_stderr": 0.01489723522945071, "acc_norm": 0.776500638569604, "acc_norm_stderr": 0.01489723522945071 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647897, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3039106145251397, "acc_stderr": 0.015382845587584518, "acc_norm": 0.3039106145251397, "acc_norm_stderr": 0.015382845587584518 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.7067901234567902, "acc_stderr": 0.025329888171900933, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 0.025329888171900933 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.45390070921985815, "acc_stderr": 0.029700453247291484, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.029700453247291484 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4361147327249022, "acc_stderr": 0.012665568135455335, "acc_norm": 0.4361147327249022, "acc_norm_stderr": 0.012665568135455335 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5772058823529411, "acc_stderr": 0.030008562845003476, "acc_norm": 0.5772058823529411, "acc_norm_stderr": 0.030008562845003476 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6045751633986928, "acc_stderr": 0.019780465954777508, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.019780465954777508 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7313432835820896, "acc_stderr": 0.03134328358208954, "acc_norm": 0.7313432835820896, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5641081747684346, "mc2_stderr": 0.015927666604862285 }, "harness|winogrande|5": { "acc": 0.7663772691397001, "acc_stderr": 0.011892194477183525 }, "harness|gsm8k|5": { "acc": 0.38968915845337376, "acc_stderr": 0.013433123236110692 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
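As a complement to the loading snippet in the card above, the aggregated metrics can be read from the "results" configuration. This is a sketch under the assumption that it exposes the same "latest" split as the per-task configurations listed in this repository's metadata.

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" follows the split
# naming used by the per-task configurations (adjust if your copy differs).
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16",
    "results",
    split="latest",
)
print(results[0])
```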
open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16
[ "region:us" ]
2023-12-04T16:46:06+00:00
{"pretty_name": "Evaluation run of uukuguy/Orca-2-13b-f16", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/Orca-2-13b-f16](https://huggingface.co/uukuguy/Orca-2-13b-f16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:43:12.398370](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16/blob/main/results_2023-12-04T16-43-12.398370.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6024963144778468,\n \"acc_stderr\": 0.03292700891541927,\n \"acc_norm\": 0.6070525664063983,\n \"acc_norm_stderr\": 0.03359636787928049,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5641081747684346,\n \"mc2_stderr\": 0.015927666604862285\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693024\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6115315674168492,\n \"acc_stderr\": 0.004864058877626273,\n \"acc_norm\": 0.7981477793268273,\n \"acc_norm_stderr\": 0.004005621755121483\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.03554180368025689,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.03554180368025689\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767762,\n \"acc_norm\": 
0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767762\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584518,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584518\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900933,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900933\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455335,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455335\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5641081747684346,\n \"mc2_stderr\": 0.015927666604862285\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38968915845337376,\n \"acc_stderr\": 0.013433123236110692\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/Orca-2-13b-f16", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-43-12.398370.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["**/details_harness|winogrande|5_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-43-12.398370.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_43_12.398370", "path": ["results_2023-12-04T16-43-12.398370.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-43-12.398370.parquet"]}]}]}
2023-12-04T16:46:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model uukuguy/Orca-2-13b-f16 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:43:12.398370 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
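The flattened card text above still says "To load the details from a run, you can for instance do the following:" but the accompanying snippet was stripped when the markdown was processed. A minimal sketch, assuming the repository id follows the leaderboard's standard "details_<org>__<model>" pattern and using the "harness_winogrande_5" config named in this record's metadata:

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the per-task configs listed in this record's
# metadata; the repository id is assumed from the leaderboard's standard naming.
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__Orca-2-13b-f16",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
print(data[0])
```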
[ "# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/Orca-2-13b-f16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:43:12.398370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/Orca-2-13b-f16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:43:12.398370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 171, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/Orca-2-13b-f16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/Orca-2-13b-f16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:43:12.398370(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f628ebf24eed548e31715cfc192f4c067b095504
# Dataset Card for "test2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mariosasko/test2
[ "region:us" ]
2023-12-04T16:47:15+00:00
{"dataset_info": {"features": [{"name": "a", "dtype": "int64"}, {"name": "b", "dtype": "string"}, {"name": "c", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 49, "num_examples": 3}], "download_size": 1536, "dataset_size": 49}}
2023-12-04T16:47:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "test2" More Information needed
[ "# Dataset Card for \"test2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"test2\"\n\nMore Information needed" ]
[ 6, 12 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"test2\"\n\nMore Information needed" ]
49c431b160c868230ed47858bbeaa72df8420d1e
# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [KnutJaegersberg/CausalLM-Platypus-14B](https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:51:57.127322](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B/blob/main/results_2023-12-04T16-51-57.127322.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6473338331104262, "acc_stderr": 0.032360970782450774, "acc_norm": 0.6510141346905667, "acc_norm_stderr": 0.033006063375479955, "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.4756853442339182, "mc2_stderr": 0.014976872039741383 }, "harness|arc:challenge|25": { "acc": 0.5119453924914675, "acc_stderr": 0.014607220340597171, "acc_norm": 0.5691126279863481, "acc_norm_stderr": 0.014471133392642471 }, "harness|hellaswag|10": { "acc": 0.602867954590719, "acc_stderr": 0.004883037758919966, "acc_norm": 0.800637323242382, "acc_norm_stderr": 0.003987047047167319 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.042039210401562783, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.042039210401562783 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926604, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 
0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.03643037168958548, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.03643037168958548 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6275862068965518, "acc_stderr": 0.04028731532947558, "acc_norm": 0.6275862068965518, "acc_norm_stderr": 0.04028731532947558 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5105820105820106, "acc_stderr": 0.02574554227604548, "acc_norm": 0.5105820105820106, "acc_norm_stderr": 0.02574554227604548 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5396825396825397, "acc_stderr": 0.04458029125470973, "acc_norm": 0.5396825396825397, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001974, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7677419354838709, "acc_stderr": 0.024022256130308235, "acc_norm": 0.7677419354838709, "acc_norm_stderr": 0.024022256130308235 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5911330049261084, "acc_stderr": 0.034590588158832314, "acc_norm": 0.5911330049261084, "acc_norm_stderr": 0.034590588158832314 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8484848484848485, "acc_stderr": 0.025545650426603613, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.025545650426603613 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397436, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6512820512820513, "acc_stderr": 0.02416278028401772, "acc_norm": 0.6512820512820513, "acc_norm_stderr": 0.02416278028401772 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.0291857149498574, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.0291857149498574 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7521008403361344, 
"acc_stderr": 0.028047967224176892, "acc_norm": 0.7521008403361344, "acc_norm_stderr": 0.028047967224176892 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719198, "acc_norm": 0.3841059602649007, "acc_norm_stderr": 0.03971301814719198 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8165137614678899, "acc_stderr": 0.016595259710399296, "acc_norm": 0.8165137614678899, "acc_norm_stderr": 0.016595259710399296 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.028867431449849313, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.028867431449849313 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.025530100460233497, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.025530100460233497 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7085201793721974, "acc_stderr": 0.030500283176545847, "acc_norm": 0.7085201793721974, "acc_norm_stderr": 0.030500283176545847 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596914, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596914 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.035817969517092825, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.035817969517092825 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5446428571428571, "acc_stderr": 0.04726835553719098, "acc_norm": 0.5446428571428571, "acc_norm_stderr": 0.04726835553719098 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.037601780060266224, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281358, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281358 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8135376756066411, "acc_stderr": 0.013927751372001503, "acc_norm": 0.8135376756066411, "acc_norm_stderr": 0.013927751372001503 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36201117318435755, "acc_stderr": 0.016073067350153087, "acc_norm": 0.36201117318435755, "acc_norm_stderr": 0.016073067350153087 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6895424836601307, "acc_stderr": 0.026493033225145894, "acc_norm": 0.6895424836601307, "acc_norm_stderr": 0.026493033225145894 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 
}, "harness|hendrycksTest-prehistory|5": { "acc": 0.691358024691358, "acc_stderr": 0.025702640260603746, "acc_norm": 0.691358024691358, "acc_norm_stderr": 0.025702640260603746 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.515645371577575, "acc_stderr": 0.012763982838120937, "acc_norm": 0.515645371577575, "acc_norm_stderr": 0.012763982838120937 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6360294117647058, "acc_stderr": 0.02922719246003203, "acc_norm": 0.6360294117647058, "acc_norm_stderr": 0.02922719246003203 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.019070985589687495, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.019070985589687495 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7755102040816326, "acc_stderr": 0.02671143055553842, "acc_norm": 0.7755102040816326, "acc_norm_stderr": 0.02671143055553842 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.4756853442339182, "mc2_stderr": 0.014976872039741383 }, "harness|winogrande|5": { "acc": 0.7600631412786109, "acc_stderr": 0.012002078629485737 }, "harness|gsm8k|5": { "acc": 0.5724033358605004, "acc_stderr": 0.013627322286986808 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B
[ "region:us" ]
2023-12-04T16:54:36+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/CausalLM-Platypus-14B](https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:51:57.127322](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B/blob/main/results_2023-12-04T16-51-57.127322.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473338331104262,\n \"acc_stderr\": 0.032360970782450774,\n \"acc_norm\": 0.6510141346905667,\n \"acc_norm_stderr\": 0.033006063375479955,\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4756853442339182,\n \"mc2_stderr\": 0.014976872039741383\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.014471133392642471\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.602867954590719,\n \"acc_stderr\": 0.004883037758919966,\n \"acc_norm\": 0.800637323242382,\n \"acc_norm_stderr\": 0.003987047047167319\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5911330049261084,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603613,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603613\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397436,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.0291857149498574,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.0291857149498574\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719198,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399296,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399296\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.035817969517092825,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.035817969517092825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281358,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281358\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001503,\n 
\"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.515645371577575,\n \"acc_stderr\": 0.012763982838120937,\n \"acc_norm\": 0.515645371577575,\n \"acc_norm_stderr\": 0.012763982838120937\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.02671143055553842,\n \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.02671143055553842\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4756853442339182,\n \"mc2_stderr\": 0.014976872039741383\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \"acc_stderr\": 0.013627322286986808\n }\n}\n```", "repo_url": 
"https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_51_57.127322", "path": ["**/details_harness|winogrande|5_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-51-57.127322.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T16_51_57.127322", "path": ["results_2023-12-04T16-51-57.127322.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T16-51-57.127322.parquet"]}]}]}
2023-12-04T16:55:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/CausalLM-Platypus-14B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the example below): ## Latest results These are the latest results from run 2023-12-04T16:51:57.127322 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
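For instance, to load the "harness_winogrande_5" configuration of this details dataset (a minimal sketch; the repository id, config name, and split follow the configs listed in this record's metadata, where the "train"/"latest" splits always point to the most recent run):

```python
from datasets import load_dataset

# Winogrande (5-shot) details for this evaluation run; "train" tracks the latest results
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B",
    "harness_winogrande_5",
    split="train",
)
```

The same pattern applies to any of the other listed configurations, e.g. "harness_gsm8k_5" or the aggregated "results" config.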
[ "# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/CausalLM-Platypus-14B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:51:57.127322(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/CausalLM-Platypus-14B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:51:57.127322(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/CausalLM-Platypus-14B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:51:57.127322(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
1f6ded7561761c9a54f4b82c7d549b3cc9a18420
# Dataset Card for "WMT-month-splits" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
KaiNylund/WMT-month-splits
[ "license:cc0-1.0", "region:us" ]
2023-12-04T16:56:44+00:00
{"license": "cc0-1.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "2021_9_train", "num_bytes": 30033326, "num_examples": 9166}, {"name": "2021_9_test", "num_bytes": 3004775, "num_examples": 906}, {"name": "2021_10_train", "num_bytes": 30036723, "num_examples": 9056}, {"name": "2021_10_test", "num_bytes": 3005002, "num_examples": 914}, {"name": "2021_11_train", "num_bytes": 30030388, "num_examples": 9139}, {"name": "2021_11_test", "num_bytes": 3006078, "num_examples": 939}], "download_size": 0, "dataset_size": 99116292}}
2024-02-12T23:26:22+00:00
[]
[]
TAGS #license-cc0-1.0 #region-us
# Dataset Card for "WMT-month-splits" More Information needed
[ "# Dataset Card for \"WMT-month-splits\"\n\nMore Information needed" ]
[ "TAGS\n#license-cc0-1.0 #region-us \n", "# Dataset Card for \"WMT-month-splits\"\n\nMore Information needed" ]
[ 14, 19 ]
[ "passage: TAGS\n#license-cc0-1.0 #region-us \n# Dataset Card for \"WMT-month-splits\"\n\nMore Information needed" ]
92a53df4732e6ed4631691a6fbad4e0ffb0763a0
# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-26B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/vmajor/Orca2-13B-selfmerge-26B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [vmajor/Orca2-13B-selfmerge-26B](https://huggingface.co/vmajor/Orca2-13B-selfmerge-26B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-26B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T16:56:25.498405](https://huggingface.co/datasets/open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-26B/blob/main/results_2023-12-04T16-56-25.498405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6021029441684177, "acc_stderr": 0.03292834355809297, "acc_norm": 0.6066088767121881, "acc_norm_stderr": 0.033597954121191174, "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5637680270447162, "mc2_stderr": 0.01593030661874887 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650649, "acc_norm": 0.6083617747440273, "acc_norm_stderr": 0.014264122124938217 }, "harness|hellaswag|10": { "acc": 0.611929894443338, "acc_stderr": 0.004863147544177514, "acc_norm": 0.7984465245966939, "acc_norm_stderr": 0.004003405481372169 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361073, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361073 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796005, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796005 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5491329479768786, "acc_stderr": 0.03794012674697031, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.03794012674697031 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.02494236893115979, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.02494236893115979 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7387096774193549, "acc_stderr": 0.02499305339776481, "acc_norm": 0.7387096774193549, "acc_norm_stderr": 0.02499305339776481 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03477691162163659, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270285, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270285 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397447, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397447 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5948717948717949, "acc_stderr": 0.02489047176993815, "acc_norm": 0.5948717948717949, "acc_norm_stderr": 0.02489047176993815 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066475, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066475 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 
0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200144, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200144 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.031493846709941306, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.031493846709941306 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384493, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384493 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7752234993614304, "acc_stderr": 0.01492744710193716, "acc_norm": 0.7752234993614304, "acc_norm_stderr": 0.01492744710193716 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647897, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30837988826815643, "acc_stderr": 0.01544571691099888, "acc_norm": 0.30837988826815643, "acc_norm_stderr": 0.01544571691099888 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.45390070921985815, "acc_stderr": 0.029700453247291484, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.029700453247291484 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4361147327249022, "acc_stderr": 0.012665568135455335, "acc_norm": 0.4361147327249022, "acc_norm_stderr": 0.012665568135455335 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5772058823529411, "acc_stderr": 0.030008562845003476, "acc_norm": 0.5772058823529411, "acc_norm_stderr": 0.030008562845003476 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6045751633986928, "acc_stderr": 0.019780465954777508, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.019780465954777508 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7313432835820896, "acc_stderr": 0.03134328358208954, "acc_norm": 0.7313432835820896, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5637680270447162, "mc2_stderr": 0.01593030661874887 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.011850040124850508 }, "harness|gsm8k|5": { "acc": 0.39196360879454134, "acc_stderr": 0.013447140886023829 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-26B
[ "region:us" ]
2023-12-04T16:59:19+00:00
{"pretty_name": "Evaluation run of vmajor/Orca2-13B-selfmerge-26B", "dataset_summary": "Dataset automatically created during the evaluation run of model [vmajor/Orca2-13B-selfmerge-26B](https://huggingface.co/vmajor/Orca2-13B-selfmerge-26B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-26B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T16:56:25.498405](https://huggingface.co/datasets/open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-26B/blob/main/results_2023-12-04T16-56-25.498405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6021029441684177,\n \"acc_stderr\": 0.03292834355809297,\n \"acc_norm\": 0.6066088767121881,\n \"acc_norm_stderr\": 0.033597954121191174,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5637680270447162,\n \"mc2_stderr\": 0.01593030661874887\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938217\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.611929894443338,\n \"acc_stderr\": 0.004863147544177514,\n \"acc_norm\": 0.7984465245966939,\n \"acc_norm_stderr\": 0.004003405481372169\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397447,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5948717948717949,\n \"acc_stderr\": 0.02489047176993815,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.02489047176993815\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193716,\n \"acc_norm\": 
0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193716\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.01544571691099888,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.01544571691099888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455335,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455335\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5637680270447162,\n \"mc2_stderr\": 0.01593030661874887\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39196360879454134,\n \"acc_stderr\": 0.013447140886023829\n }\n}\n```", "repo_url": 
"https://huggingface.co/vmajor/Orca2-13B-selfmerge-26B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-56-25.498405.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-56-25.498405.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-56-25.498405.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T16-56-25.498405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-56-25.498405.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-56-25.498405.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["**/details_harness|winogrande|5_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T16-56-25.498405.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T16_56_25.498405", "path": ["results_2023-12-04T16-56-25.498405.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T16-56-25.498405.parquet"]}]}]}
2023-12-04T17:00:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-26B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-26B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T16:56:25.498405 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-26B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-26B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:56:25.498405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-26B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-26B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T16:56:25.498405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-26B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-26B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T16:56:25.498405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
913d03af0a57969a793d26780731364224e618b3
# Audio Dataset ## Created by: [Xz乔希](https://space.bilibili.com/5859321) ## Notes 1. The data is taken from video clips of the corresponding people, and the voice copyright belongs to those people; early recordings of poor quality were not uploaded. 2. The audio has only been vocal-separated and automatically sliced, with no manual curation; please download it and spot-check a random selection by ear before deciding whether to use it (discarded audio was skipped during manual annotation). 3. Manual annotation files are included as random drops (manual annotation cannot guarantee that every sentence is labeled precisely; you can check them yourself). 4. Please test and use this only within the scope permitted by law! Any problems arising from the use of this dataset are your own responsibility!
XzJosh/audiodataset
[ "region:us" ]
2023-12-04T17:01:55+00:00
{}
2024-01-29T15:05:22+00:00
[]
[]
TAGS #region-us
# 音频数据集 ## 制作:Xz乔希 ## 注意 1、数据集均取自对应人物视频切片,声音版权归属于对应人物,早期质量一坨的就没上传; 2、音频仅进行分离人声及自动切片,未进行精选,请下载进行抽选试听后再考虑是否使用(弃用音频在手工标注时进行了跳过); 3、手工标注文件随机掉落(手工标注无法保证每一句都标的很标准,可以自行检查); 4、请在法律允许范围内进行测试使用!使用本数据集产生问题请自行承担!
[ "# 音频数据集", "## 制作:Xz乔希", "## 注意\n1、数据集均取自对应人物视频切片,声音版权归属于对应人物,早期质量一坨的就没上传;\n\n2、音频仅进行分离人声及自动切片,未进行精选,请下载进行抽选试听后再考虑是否使用(弃用音频在手工标注时进行了跳过);\n\n3、手工标注文件随机掉落(手工标注无法保证每一句都标的很标准,可以自行检查);\n\n4、请在法律允许范围内进行测试使用!使用本数据集产生问题请自行承担!" ]
[ "TAGS\n#region-us \n", "# 音频数据集", "## 制作:Xz乔希", "## 注意\n1、数据集均取自对应人物视频切片,声音版权归属于对应人物,早期质量一坨的就没上传;\n\n2、音频仅进行分离人声及自动切片,未进行精选,请下载进行抽选试听后再考虑是否使用(弃用音频在手工标注时进行了跳过);\n\n3、手工标注文件随机掉落(手工标注无法保证每一句都标的很标准,可以自行检查);\n\n4、请在法律允许范围内进行测试使用!使用本数据集产生问题请自行承担!" ]
[ 6, 6, 8, 126 ]
[ "passage: TAGS\n#region-us \n# 音频数据集## 制作:Xz乔希## 注意\n1、数据集均取自对应人物视频切片,声音版权归属于对应人物,早期质量一坨的就没上传;\n\n2、音频仅进行分离人声及自动切片,未进行精选,请下载进行抽选试听后再考虑是否使用(弃用音频在手工标注时进行了跳过);\n\n3、手工标注文件随机掉落(手工标注无法保证每一句都标的很标准,可以自行检查);\n\n4、请在法律允许范围内进行测试使用!使用本数据集产生问题请自行承担!" ]
db53e3ab9e242ca4449c0a9bd1f0cb6ba1da5fa8
# Corpus This dataset is built from Magicdata [ASR-CCHSHDIACSC: A CHINESE CHANGSHA DIALECT CONVERSATIONAL SPEECH CORPUS](https://magichub.com/datasets/changsha-dialect-conversational-speech-corpus/) This corpus is licensed under a [Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License](http://creativecommons.org/licenses/by-nc-nd/4.0/). Please refer to the license for further information. Modifications: The audio is split into sentences based on the time spans in the transcription file. Sentences that span less than 1 second are discarded. Topics of conversation are removed. # Usage To load this dataset, use ```python from datasets import load_dataset dialect_corpus = load_dataset("TingChen-ppmc/Changsha_Dialect_Conversational_Speech_Corpus") ``` This dataset only has a train split. To split out a test split, use ```python from datasets import load_dataset train_split = load_dataset("TingChen-ppmc/Changsha_Dialect_Conversational_Speech_Corpus", split="train") # where test_size=0.5 denotes that 0.5 of the dataset will be split off as the test split corpus = train_split.train_test_split(test_size=0.5) ``` A data sample would look like ```python # note this data is from the Nanchang Dialect corpus, the data format is shared {'audio': {'path': 'A0001_S001_0_G0001_0.WAV', 'array': array([-0.00030518, -0.00039673, -0.00036621, ..., -0.00064087, -0.00015259, -0.00042725]), 'sampling_rate': 16000}, 'gender': '女', 'speaker_id': 'G0001', 'transcription': '北京爱数智慧语音采集' } ``` [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
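As a small follow-up to the loading snippet in the card above, a sketch of inspecting one example once the corpus is loaded. It only uses the fields declared in the dataset metadata (audio, gender, speaker_id, transcription); the index 0 is just an illustrative choice.

```python
from datasets import load_dataset

ds = load_dataset(
    "TingChen-ppmc/Changsha_Dialect_Conversational_Speech_Corpus", split="train"
)

sample = ds[0]
# Each row carries the decoded waveform plus speaker metadata and a transcription.
print(sample["speaker_id"], sample["gender"], sample["transcription"])

# Approximate clip duration in seconds from the raw samples and the sampling rate.
duration = len(sample["audio"]["array"]) / sample["audio"]["sampling_rate"]
print(f"{duration:.2f} s")
```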
TingChen-ppmc/Changsha_Dialect_Conversational_Speech_Corpus
[ "region:us" ]
2023-12-04T17:02:15+00:00
{"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "gender", "dtype": "string"}, {"name": "speaker_id", "dtype": "string"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 223664136.256, "num_examples": 1488}], "download_size": 215320750, "dataset_size": 223664136.256}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-12-20T15:49:23+00:00
[]
[]
TAGS #region-us
# Corpus This dataset is built from Magicdata ASR-CCHSHDIACSC: A CHINESE CHANGSHA DIALECT CONVERSATIONAL SPEECH CORPUS This corpus is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Please refer to the license for further information. Modifications: The audio is split in sentences based on the time span on the transcription file. Sentences that span less than 1 second is discarded. Topics of conversation is removed. # Usage To load this dataset, use This dataset only has train split. To split out a test split, use A sample data would be More Information needed
[ "# Corpus\n\nThis dataset is built from Magicdata ASR-CCHSHDIACSC: A CHINESE CHANGSHA DIALECT CONVERSATIONAL SPEECH CORPUS\n\nThis corpus is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Please refer to the license for further information.\n\nModifications: The audio is split in sentences based on the time span on the transcription file. Sentences that span less than 1 second is discarded. Topics of conversation is removed.", "# Usage\n\nTo load this dataset, use\n\n\n\nThis dataset only has train split. To split out a test split, use\n\n\n\nA sample data would be\n\n\n\n\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Corpus\n\nThis dataset is built from Magicdata ASR-CCHSHDIACSC: A CHINESE CHANGSHA DIALECT CONVERSATIONAL SPEECH CORPUS\n\nThis corpus is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Please refer to the license for further information.\n\nModifications: The audio is split in sentences based on the time span on the transcription file. Sentences that span less than 1 second is discarded. Topics of conversation is removed.", "# Usage\n\nTo load this dataset, use\n\n\n\nThis dataset only has train split. To split out a test split, use\n\n\n\nA sample data would be\n\n\n\n\n\nMore Information needed" ]
[ 6, 112, 34 ]
[ "passage: TAGS\n#region-us \n# Corpus\n\nThis dataset is built from Magicdata ASR-CCHSHDIACSC: A CHINESE CHANGSHA DIALECT CONVERSATIONAL SPEECH CORPUS\n\nThis corpus is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Please refer to the license for further information.\n\nModifications: The audio is split in sentences based on the time span on the transcription file. Sentences that span less than 1 second is discarded. Topics of conversation is removed.# Usage\n\nTo load this dataset, use\n\n\n\nThis dataset only has train split. To split out a test split, use\n\n\n\nA sample data would be\n\n\n\n\n\nMore Information needed" ]
7fc209523ae1d18cf7552c06de04044f9437d86e
# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-39B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/vmajor/Orca2-13B-selfmerge-39B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [vmajor/Orca2-13B-selfmerge-39B](https://huggingface.co/vmajor/Orca2-13B-selfmerge-39B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-39B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T17:00:35.598511](https://huggingface.co/datasets/open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-39B/blob/main/results_2023-12-04T17-00-35.598511.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6021029441684177, "acc_stderr": 0.03292834355809297, "acc_norm": 0.6066088767121881, "acc_norm_stderr": 0.033597954121191174, "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5637680270447162, "mc2_stderr": 0.01593030661874887 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650649, "acc_norm": 0.6083617747440273, "acc_norm_stderr": 0.014264122124938217 }, "harness|hellaswag|10": { "acc": 0.611929894443338, "acc_stderr": 0.004863147544177514, "acc_norm": 0.7984465245966939, "acc_norm_stderr": 0.004003405481372169 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361073, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361073 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796005, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796005 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, 
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5491329479768786, "acc_stderr": 0.03794012674697031, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.03794012674697031 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.02494236893115979, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.02494236893115979 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7387096774193549, "acc_stderr": 0.02499305339776481, "acc_norm": 0.7387096774193549, "acc_norm_stderr": 0.02499305339776481 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03477691162163659, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270285, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270285 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397447, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397447 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5948717948717949, "acc_stderr": 0.02489047176993815, "acc_norm": 0.5948717948717949, "acc_norm_stderr": 0.02489047176993815 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066475, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066475 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 
0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200144, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200144 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4722222222222222, "acc_stderr": 0.0340470532865388, "acc_norm": 0.4722222222222222, "acc_norm_stderr": 0.0340470532865388 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.031493846709941306, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.031493846709941306 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384493, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384493 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7752234993614304, "acc_stderr": 0.01492744710193716, "acc_norm": 0.7752234993614304, "acc_norm_stderr": 0.01492744710193716 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647897, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.30837988826815643, "acc_stderr": 0.01544571691099888, "acc_norm": 0.30837988826815643, "acc_norm_stderr": 0.01544571691099888 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6881028938906752, "acc_stderr": 0.02631185807185416, "acc_norm": 0.6881028938906752, "acc_norm_stderr": 0.02631185807185416 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.7129629629629629, "acc_stderr": 0.02517104191530968, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.02517104191530968 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.45390070921985815, "acc_stderr": 0.029700453247291484, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.029700453247291484 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4361147327249022, "acc_stderr": 0.012665568135455335, "acc_norm": 0.4361147327249022, "acc_norm_stderr": 0.012665568135455335 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5772058823529411, "acc_stderr": 0.030008562845003476, "acc_norm": 0.5772058823529411, "acc_norm_stderr": 0.030008562845003476 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6045751633986928, "acc_stderr": 0.019780465954777508, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.019780465954777508 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7313432835820896, "acc_stderr": 0.03134328358208954, "acc_norm": 0.7313432835820896, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.401468788249694, "mc1_stderr": 0.017160273901693654, "mc2": 0.5637680270447162, "mc2_stderr": 0.01593030661874887 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.011850040124850508 }, "harness|gsm8k|5": { "acc": 0.39196360879454134, "acc_stderr": 0.013447140886023829 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-39B
[ "region:us" ]
2023-12-04T17:03:31+00:00
{"pretty_name": "Evaluation run of vmajor/Orca2-13B-selfmerge-39B", "dataset_summary": "Dataset automatically created during the evaluation run of model [vmajor/Orca2-13B-selfmerge-39B](https://huggingface.co/vmajor/Orca2-13B-selfmerge-39B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-39B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:00:35.598511](https://huggingface.co/datasets/open-llm-leaderboard/details_vmajor__Orca2-13B-selfmerge-39B/blob/main/results_2023-12-04T17-00-35.598511.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6021029441684177,\n \"acc_stderr\": 0.03292834355809297,\n \"acc_norm\": 0.6066088767121881,\n \"acc_norm_stderr\": 0.033597954121191174,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5637680270447162,\n \"mc2_stderr\": 0.01593030661874887\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938217\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.611929894443338,\n \"acc_stderr\": 0.004863147544177514,\n \"acc_norm\": 0.7984465245966939,\n \"acc_norm_stderr\": 0.004003405481372169\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397447,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5948717948717949,\n \"acc_stderr\": 0.02489047176993815,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.02489047176993815\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193716,\n \"acc_norm\": 
0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193716\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n \"acc_stderr\": 0.01544571691099888,\n \"acc_norm\": 0.30837988826815643,\n \"acc_norm_stderr\": 0.01544571691099888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n \"acc_stderr\": 0.012665568135455335,\n \"acc_norm\": 0.4361147327249022,\n \"acc_norm_stderr\": 0.012665568135455335\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777508,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777508\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5637680270447162,\n \"mc2_stderr\": 0.01593030661874887\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39196360879454134,\n \"acc_stderr\": 0.013447140886023829\n }\n}\n```", "repo_url": 
"https://huggingface.co/vmajor/Orca2-13B-selfmerge-39B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-00-35.598511.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-00-35.598511.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-00-35.598511.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-00-35.598511.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-00-35.598511.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-00-35.598511.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["**/details_harness|winogrande|5_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-00-35.598511.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_04T17_00_35.598511", "path": ["results_2023-12-04T17-00-35.598511.parquet"]}, {"split": "latest", "path": 
["results_2023-12-04T17-00-35.598511.parquet"]}]}]}
2023-12-04T17:04:15+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-39B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-39B on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T17:00:35.598511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-39B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-39B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:00:35.598511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-39B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-39B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:00:35.598511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 174, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vmajor/Orca2-13B-selfmerge-39B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vmajor/Orca2-13B-selfmerge-39B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T17:00:35.598511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
42581b1f992aa3aa2875e8a9a6039a3b170d22f0
# Dataset Card for Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KnutJaegersberg/Galpaca-30b-MiniOrca - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [KnutJaegersberg/Galpaca-30b-MiniOrca](https://huggingface.co/KnutJaegersberg/Galpaca-30b-MiniOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T17:03:58.676695](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca/blob/main/results_2023-12-04T17-03-58.676695.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4332138408065948, "acc_stderr": 0.034814041033432816, "acc_norm": 0.4404152598288463, "acc_norm_stderr": 0.035699301603131575, "mc1": 0.25458996328029376, "mc1_stderr": 0.015250117079156503, "mc2": 0.41103781665039923, "mc2_stderr": 0.014554592723373545 }, "harness|arc:challenge|25": { "acc": 0.44795221843003413, "acc_stderr": 0.01453201149821167, "acc_norm": 0.48890784982935154, "acc_norm_stderr": 0.014607794914013057 }, "harness|hellaswag|10": { "acc": 0.44164509061939855, "acc_stderr": 0.004955681533284337, "acc_norm": 0.5779725154351723, "acc_norm_stderr": 0.00492873510363583 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4934210526315789, "acc_stderr": 0.040685900502249704, "acc_norm": 0.4934210526315789, "acc_norm_stderr": 0.040685900502249704 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.49056603773584906, "acc_stderr": 0.030767394707808086, "acc_norm": 0.49056603773584906, "acc_norm_stderr": 0.030767394707808086 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.041666666666666644, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.041666666666666644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37,
"acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4393063583815029, "acc_stderr": 0.03784271932887467, "acc_norm": 0.4393063583815029, "acc_norm_stderr": 0.03784271932887467 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3235294117647059, "acc_stderr": 0.046550104113196177, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.046550104113196177 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4553191489361702, "acc_stderr": 0.032555253593403555, "acc_norm": 0.4553191489361702, "acc_norm_stderr": 0.032555253593403555 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939392, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939392 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4896551724137931, "acc_stderr": 0.041657747757287644, "acc_norm": 0.4896551724137931, "acc_norm_stderr": 0.041657747757287644 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3253968253968254, "acc_stderr": 0.02413015829976262, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.02413015829976262 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3412698412698413, "acc_stderr": 0.04240799327574925, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.04240799327574925 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, "acc_stderr": 0.028396016402761005, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761005 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.4484848484848485, "acc_stderr": 0.03883565977956929, "acc_norm": 0.4484848484848485, "acc_norm_stderr": 0.03883565977956929 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5505050505050505, "acc_stderr": 0.03544132491947969, "acc_norm": 0.5505050505050505, "acc_norm_stderr": 0.03544132491947969 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.49740932642487046, "acc_stderr": 0.03608390745384487, "acc_norm": 0.49740932642487046, "acc_norm_stderr": 0.03608390745384487 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3923076923076923, "acc_stderr": 0.02475600038213095, "acc_norm": 0.3923076923076923, "acc_norm_stderr": 0.02475600038213095 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2222222222222222, "acc_stderr": 0.02534809746809784, "acc_norm": 0.2222222222222222, "acc_norm_stderr": 0.02534809746809784 }, 
"harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3907563025210084, "acc_stderr": 0.031693802357129965, "acc_norm": 0.3907563025210084, "acc_norm_stderr": 0.031693802357129965 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.5743119266055046, "acc_stderr": 0.0211992359724708, "acc_norm": 0.5743119266055046, "acc_norm_stderr": 0.0211992359724708 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.35784313725490197, "acc_stderr": 0.033644872860882996, "acc_norm": 0.35784313725490197, "acc_norm_stderr": 0.033644872860882996 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.569620253164557, "acc_stderr": 0.03223017195937597, "acc_norm": 0.569620253164557, "acc_norm_stderr": 0.03223017195937597 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5336322869955157, "acc_stderr": 0.03348180017060306, "acc_norm": 0.5336322869955157, "acc_norm_stderr": 0.03348180017060306 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.46564885496183206, "acc_stderr": 0.04374928560599738, "acc_norm": 0.46564885496183206, "acc_norm_stderr": 0.04374928560599738 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5537190082644629, "acc_stderr": 0.04537935177947879, "acc_norm": 0.5537190082644629, "acc_norm_stderr": 0.04537935177947879 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04803752235190193, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04803752235190193 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.44785276073619634, "acc_stderr": 0.03906947479456601, "acc_norm": 0.44785276073619634, "acc_norm_stderr": 0.03906947479456601 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.5145631067961165, "acc_stderr": 0.04948637324026637, "acc_norm": 0.5145631067961165, "acc_norm_stderr": 0.04948637324026637 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6111111111111112, "acc_stderr": 0.031937057262002924, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.031937057262002924 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.48020434227330777, "acc_stderr": 0.017865944827291626, "acc_norm": 0.48020434227330777, "acc_norm_stderr": 0.017865944827291626 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.4190751445086705, "acc_stderr": 0.02656417811142263, "acc_norm": 0.4190751445086705, "acc_norm_stderr": 0.02656417811142263 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23910614525139665, "acc_stderr": 0.014265554192331146, "acc_norm": 0.23910614525139665, "acc_norm_stderr": 0.014265554192331146 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5392156862745098, "acc_stderr": 0.028541722692618874, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.028541722692618874 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.43086816720257237, "acc_stderr": 
0.028125340983972714, "acc_norm": 0.43086816720257237, "acc_norm_stderr": 0.028125340983972714 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.4660493827160494, "acc_stderr": 0.027756535257347663, "acc_norm": 0.4660493827160494, "acc_norm_stderr": 0.027756535257347663 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3191489361702128, "acc_stderr": 0.0278079901413202, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.0278079901413202 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3344198174706649, "acc_stderr": 0.012049668983214933, "acc_norm": 0.3344198174706649, "acc_norm_stderr": 0.012049668983214933 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5, "acc_stderr": 0.030372836961539352, "acc_norm": 0.5, "acc_norm_stderr": 0.030372836961539352 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.45751633986928103, "acc_stderr": 0.020154685712590884, "acc_norm": 0.45751633986928103, "acc_norm_stderr": 0.020154685712590884 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.4818181818181818, "acc_stderr": 0.04785964010794917, "acc_norm": 0.4818181818181818, "acc_norm_stderr": 0.04785964010794917 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.39183673469387753, "acc_stderr": 0.03125127591089165, "acc_norm": 0.39183673469387753, "acc_norm_stderr": 0.03125127591089165 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5373134328358209, "acc_stderr": 0.03525675167467974, "acc_norm": 0.5373134328358209, "acc_norm_stderr": 0.03525675167467974 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-virology|5": { "acc": 0.45180722891566266, "acc_stderr": 0.03874371556587953, "acc_norm": 0.45180722891566266, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.38011695906432746, "acc_stderr": 0.037229657413855394, "acc_norm": 0.38011695906432746, "acc_norm_stderr": 0.037229657413855394 }, "harness|truthfulqa:mc|0": { "mc1": 0.25458996328029376, "mc1_stderr": 0.015250117079156503, "mc2": 0.41103781665039923, "mc2_stderr": 0.014554592723373545 }, "harness|winogrande|5": { "acc": 0.6006314127861089, "acc_stderr": 0.013764933546717614 }, "harness|gsm8k|5": { "acc": 0.01819560272934041, "acc_stderr": 0.0036816118940738735 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca
[ "region:us" ]
2023-12-04T17:06:08+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Galpaca-30b-MiniOrca](https://huggingface.co/KnutJaegersberg/Galpaca-30b-MiniOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:03:58.676695](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca/blob/main/results_2023-12-04T17-03-58.676695.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4332138408065948,\n \"acc_stderr\": 0.034814041033432816,\n \"acc_norm\": 0.4404152598288463,\n \"acc_norm_stderr\": 0.035699301603131575,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156503,\n \"mc2\": 0.41103781665039923,\n \"mc2_stderr\": 0.014554592723373545\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.44795221843003413,\n \"acc_stderr\": 0.01453201149821167,\n \"acc_norm\": 0.48890784982935154,\n \"acc_norm_stderr\": 0.014607794914013057\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44164509061939855,\n \"acc_stderr\": 0.004955681533284337,\n \"acc_norm\": 0.5779725154351723,\n \"acc_norm_stderr\": 0.00492873510363583\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.030767394707808086,\n \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.030767394707808086\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.041666666666666644,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.041666666666666644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n 
\"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.02413015829976262,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.02413015829976262\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.03883565977956929,\n \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.03883565977956929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5505050505050505,\n \"acc_stderr\": 0.03544132491947969,\n \"acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.03544132491947969\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.49740932642487046,\n \"acc_stderr\": 0.03608390745384487,\n \"acc_norm\": 0.49740932642487046,\n \"acc_norm_stderr\": 0.03608390745384487\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3923076923076923,\n \"acc_stderr\": 0.02475600038213095,\n \"acc_norm\": 0.3923076923076923,\n \"acc_norm_stderr\": 0.02475600038213095\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02534809746809784,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02534809746809784\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5743119266055046,\n \"acc_stderr\": 0.0211992359724708,\n \"acc_norm\": 0.5743119266055046,\n \"acc_norm_stderr\": 0.0211992359724708\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.35784313725490197,\n \"acc_stderr\": 0.033644872860882996,\n \"acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.033644872860882996\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.569620253164557,\n \"acc_stderr\": 0.03223017195937597,\n \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.03223017195937597\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n \"acc_stderr\": 0.03348180017060306,\n \"acc_norm\": 0.5336322869955157,\n \"acc_norm_stderr\": 0.03348180017060306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.04537935177947879,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.04537935177947879\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456601,\n \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456601\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.04948637324026637,\n \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.04948637324026637\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.031937057262002924,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.031937057262002924\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.48020434227330777,\n \"acc_stderr\": 0.017865944827291626,\n \"acc_norm\": 0.48020434227330777,\n \"acc_norm_stderr\": 0.017865944827291626\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.02656417811142263,\n \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.02656417811142263\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331146,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331146\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43086816720257237,\n \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.43086816720257237,\n \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.027756535257347663,\n \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.027756535257347663\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.0278079901413202,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.0278079901413202\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3344198174706649,\n \"acc_stderr\": 0.012049668983214933,\n \"acc_norm\": 0.3344198174706649,\n \"acc_norm_stderr\": 0.012049668983214933\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.020154685712590884,\n \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.020154685712590884\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.03125127591089165,\n \"acc_norm\": 0.39183673469387753,\n \"acc_norm_stderr\": 0.03125127591089165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n \"acc_stderr\": 0.03525675167467974,\n \"acc_norm\": 0.5373134328358209,\n \"acc_norm_stderr\": 0.03525675167467974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156503,\n \"mc2\": 0.41103781665039923,\n \"mc2_stderr\": 0.014554592723373545\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6006314127861089,\n \"acc_stderr\": 0.013764933546717614\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.0036816118940738735\n }\n}\n```", 
"repo_url": "https://huggingface.co/KnutJaegersberg/Galpaca-30b-MiniOrca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-03-58.676695.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-03-58.676695.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-03-58.676695.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-03-58.676695.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-03-58.676695.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_03_58.676695", "path": ["**/details_harness|winogrande|5_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-03-58.676695.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T17_03_58.676695", "path": ["results_2023-12-04T17-03-58.676695.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-03-58.676695.parquet"]}]}]}
2023-12-04T17:07:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model KnutJaegersberg/Galpaca-30b-MiniOrca on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following:

## Latest results

These are the latest results from run 2023-12-04T17:03:58.676695 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
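The processed card above keeps the sentence "To load the details from a run, you can for instance do the following:" but, in this stripped-down copy, the snippet that originally followed it is gone. A minimal sketch of that load call, assuming the dataset id follows the details_<org>__<model> naming pattern used by the other leaderboard detail repositories in this document, and using a config name and split taken from the metadata above:

```python
from datasets import load_dataset

# Dataset id is an assumption based on the details_<org>__<model> naming pattern;
# "harness_winogrande_5" and "latest" are a config name and split listed in this
# run's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Galpaca-30b-MiniOrca",
    "harness_winogrande_5",
    split="latest",
)
```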
[ "# Dataset Card for Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Galpaca-30b-MiniOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:03:58.676695(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Galpaca-30b-MiniOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:03:58.676695(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 24, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Galpaca-30b-MiniOrca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Galpaca-30b-MiniOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T17:03:58.676695(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b7317c9f7a6ef08a84a5de2d4a15df9b44c19c5d
# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Ba2han/HermesStar-OrcaWind-Synth-11B](https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T17:07:13.650771](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B/blob/main/results_2023-12-04T17-07-13.650771.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6532797554810081, "acc_stderr": 0.03195077848596749, "acc_norm": 0.6561799730695683, "acc_norm_stderr": 0.032588424826097764, "mc1": 0.31211750305997554, "mc1_stderr": 0.016220756769520932, "mc2": 0.48549947564682355, "mc2_stderr": 0.01511459123723551 }, "harness|arc:challenge|25": { "acc": 0.6126279863481229, "acc_stderr": 0.01423587248790987, "acc_norm": 0.6527303754266212, "acc_norm_stderr": 0.013913034529620448 }, "harness|hellaswag|10": { "acc": 0.644991037641904, "acc_stderr": 0.004775380866948017, "acc_norm": 0.836885082652858, "acc_norm_stderr": 0.0036871539405687955 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939098, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.038234289699266046, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.038234289699266046 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr":
0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.035676037996391706, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.035676037996391706 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909284, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909284 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5317460317460317, "acc_stderr": 0.04463112720677173, "acc_norm": 0.5317460317460317, "acc_norm_stderr": 0.04463112720677173 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8064516129032258, "acc_stderr": 0.022475258525536057, "acc_norm": 0.8064516129032258, "acc_norm_stderr": 0.022475258525536057 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6871794871794872, "acc_stderr": 0.02350757902064536, "acc_norm": 0.6871794871794872, "acc_norm_stderr": 0.02350757902064536 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.029443169323031537, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.029443169323031537 }, "harness|hendrycksTest-high_school_microeconomics|5": 
{ "acc": 0.7016806722689075, "acc_stderr": 0.02971914287634285, "acc_norm": 0.7016806722689075, "acc_norm_stderr": 0.02971914287634285 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5601851851851852, "acc_stderr": 0.0338517797604481, "acc_norm": 0.5601851851851852, "acc_norm_stderr": 0.0338517797604481 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8312236286919831, "acc_stderr": 0.024381406832586223, "acc_norm": 0.8312236286919831, "acc_norm_stderr": 0.024381406832586223 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7130044843049327, "acc_stderr": 0.030360379710291947, "acc_norm": 0.7130044843049327, "acc_norm_stderr": 0.030360379710291947 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.03457272836917671, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917671 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7668711656441718, "acc_stderr": 0.0332201579577674, "acc_norm": 0.7668711656441718, "acc_norm_stderr": 0.0332201579577674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.023948512905468365, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.023948512905468365 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3754189944134078, "acc_stderr": 0.01619510424846353, "acc_norm": 0.3754189944134078, "acc_norm_stderr": 0.01619510424846353 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7516339869281046, "acc_stderr": 0.02473998135511359, "acc_norm": 0.7516339869281046, "acc_norm_stderr": 0.02473998135511359 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 
0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.025006469755799208, "acc_norm": 0.7191358024691358, "acc_norm_stderr": 0.025006469755799208 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46936114732724904, "acc_stderr": 0.012746237711716634, "acc_norm": 0.46936114732724904, "acc_norm_stderr": 0.012746237711716634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7022058823529411, "acc_stderr": 0.02777829870154544, "acc_norm": 0.7022058823529411, "acc_norm_stderr": 0.02777829870154544 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.01890101532209309, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.01890101532209309 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.028666857790274648, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.028666857790274648 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.31211750305997554, "mc1_stderr": 0.016220756769520932, "mc2": 0.48549947564682355, "mc2_stderr": 0.01511459123723551 }, "harness|winogrande|5": { "acc": 0.8011049723756906, "acc_stderr": 0.011218629972515319 }, "harness|gsm8k|5": { "acc": 0.5663381349507203, "acc_stderr": 0.013650728047064697 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
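The card above explains that every configuration carries a timestamped split plus a "latest" split, and that an aggregated "results" configuration sits alongside the per-task ones. A short sketch of reading those aggregated metrics for this run; the config and split names come from the card and metadata, while the exact row layout is an assumption:

```python
from datasets import load_dataset

# "results" is the aggregated configuration described in the card;
# "latest" is the split that always tracks the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B",
    "results",
    split="latest",
)
# Each row holds the aggregated metrics of one evaluation run (layout assumed).
print(results[0])
```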
open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B
[ "region:us" ]
2023-12-04T17:10:02+00:00
{"pretty_name": "Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Ba2han/HermesStar-OrcaWind-Synth-11B](https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T17:07:13.650771](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B/blob/main/results_2023-12-04T17-07-13.650771.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532797554810081,\n \"acc_stderr\": 0.03195077848596749,\n \"acc_norm\": 0.6561799730695683,\n \"acc_norm_stderr\": 0.032588424826097764,\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.48549947564682355,\n \"mc2_stderr\": 0.01511459123723551\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620448\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.644991037641904,\n \"acc_stderr\": 0.004775380866948017,\n \"acc_norm\": 0.836885082652858,\n \"acc_norm_stderr\": 0.0036871539405687955\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586223,\n \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586223\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520932,\n \"mc2\": 0.48549947564682355,\n \"mc2_stderr\": 0.01511459123723551\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515319\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \"acc_stderr\": 0.013650728047064697\n 
}\n}\n```", "repo_url": "https://huggingface.co/Ba2han/HermesStar-OrcaWind-Synth-11B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_04T17_07_13.650771", "path": ["**/details_harness|winogrande|5_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T17-07-13.650771.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_04T17_07_13.650771", "path": ["results_2023-12-04T17-07-13.650771.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T17-07-13.650771.parquet"]}]}]}
2023-12-04T17:10:47+00:00
[]
[]
TAGS
#region-us
# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B

## Dataset Description

- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model Ba2han/HermesStar-OrcaWind-Synth-11B on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the hedged example sketched at the end of this card):

## Latest results

These are the latest results from run 2023-12-04T17:07:13.650771 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
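The loading note above does not carry an actual snippet, so here is a minimal sketch using the `datasets` library. The config names (`harness_winogrande_5`, `results`) and the `latest` split are taken from the metadata above; the repository id is an assumption made for illustration and may differ from the actual details repository.

```python
# Minimal sketch for loading this evaluation run's details.
# ASSUMPTION: the repository id below is a placeholder; the real details
# repo for Ba2han/HermesStar-OrcaWind-Synth-11B may be named differently.
from datasets import load_dataset

repo_id = "open-llm-leaderboard/details_Ba2han__HermesStar-OrcaWind-Synth-11B"  # assumed

# Per-sample details for one task; config names come from the `configs`
# metadata above (e.g. "harness_winogrande_5"), and the "latest" split
# always points to the most recent run.
winogrande = load_dataset(repo_id, "harness_winogrande_5", split="latest")

# Aggregated metrics for the whole run live in the "results" config.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```

Swapping the config name for any other `harness_*` entry listed in the metadata (for instance `harness_gsm8k_5` or `harness_hendrycksTest_world_religions_5`) loads that task's per-sample details instead.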
[ "# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Ba2han/HermesStar-OrcaWind-Synth-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:07:13.650771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Ba2han/HermesStar-OrcaWind-Synth-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T17:07:13.650771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 176, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Ba2han/HermesStar-OrcaWind-Synth-11B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Ba2han/HermesStar-OrcaWind-Synth-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T17:07:13.650771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]